Corey Parker

An experienced research analyst with a strong background in financial markets. My core competencies are data analysis, business intelligence and following financial market trends. Online research has been my passion for many years.

Wednesday, 15 November 2017 11:16

Do you trust that news?

In its ongoing efforts to address the scourge of misleading and false news, Google recently announced a new feature that helps readers evaluate a news source they may not be familiar with. Now, when you search for a particular publication, the Knowledge Panel – that preformatted answers box that often appears at the top of search results – includes information about that publisher.


Depending on the publication, that can include awards they have won, the topics they cover most extensively and their political alignment. If content from the publication has recently been reviewed by an authoritative fact-checker, those items are also featured in the Knowledge Panel. [UPDATED: This seems to work in Google Chrome and Safari, but not Firefox. Thanks, Pam Wren, for the heads up!]

So, for example, if you Google “Wall Street Journal”, your search results page will include a Knowledge Panel like this:

[Image: Wall Street Journal Knowledge Panel]

You’ll see a one-sentence blurb from the Wikipedia article about the newspaper, links to professional awards for reporting, and a summary of the topics they have recently covered — in the case of the Wall Street Journal, that’s the Federal Reserve, advertising, sales and taxes… about right for a newspaper described as business-focused.

And if you Google “Breitbart”, your search results page will include a Knowledge Panel like this:

[Image: Breitbart Knowledge Panel]

If you click the link for “Writes About”, you’ll see that Breitbart has recently covered Donald Trump, Barack Obama, the Republican Party and Hillary Clinton… what you might expect from what the Wikipedia article describes as a “far-right American news, opinion and commentary website”. But note the “Reviewed Claims” tab, highlighting reported facts that were then determined to be false by fact-checkers like Snopes, PolitiFact and FactCheck. This stands out as a concern — most news sources’ Knowledge Panels don’t include lists of reported facts that were questioned and reviewed by fact-checking sites.

This is a great way for librarians and information professionals to instill a little FUD (Fear, Uncertainty and Doubt) when their clients assume that whatever they see on their Facebook feed is reliable. And check out Vanessa Otero’s infographic, What, Exactly, Are We Reading?, a nice chart of where various media sources fall, both in terms of reliability/fabrication and liberal/conservative.

 

Source: This article was published at reluctant-entrepreneur.com

In their book Write Your Business Plan, the staff of Entrepreneur Media, Inc. offer an in-depth understanding of what’s essential to any business plan, what’s appropriate for your venture, and what it takes to ensure success. In this edited excerpt, the authors discuss the whys and hows of conducting market research.

Market research aims to understand the reasons consumers will buy your product. It studies such things as consumer behavior, including how cultural, societal and personal factors influence that behavior.

Market research is further split into two varieties: primary and secondary. Primary research studies customers directly, whereas secondary research studies information that others have gathered about customers. Primary research might be telephone interviews or online polls with randomly selected members of the target group. You can also study your own sales records to gather primary research. Secondary research might come from reports found on the websites of various other organizations or blogs written about the industry. For your plan, you can use either type of research or a combination of both.

The basic questions you’ll try to answer with your market research include:

Who are your customers? Describe them in terms of age, occupation, income, lifestyle, educational attainment, etc.

What do they buy now? Describe their buying habits relating to your product or service, including how much they buy, their favored suppliers, the most popular features and the predominant price points.

Why do they buy? This is the tricky one, attempting as it does to delve into consumers’ heads. Answers will depend on the product and its uses. Cookware buyers may buy the products that offer the most effective nonstick surfaces, or those that give the most pans in a package for a given amount of money, or those that come in the most decorative colors.

What will make them buy from you? Although some of these questions may seem difficult, you’d be surprised at the detailed information that's available about markets, sales figures and consumer buying motivations. Tapping information sources to provide the answers to as many questions as you can will make your plan more convincing and your odds of success higher. Also, many business plan software programs include detailed research and provide access to online research. Utilize this functionality if you're using such software, and add data you find elsewhere. The reason to add some of your own unique material is that everyone using the software program is tapping into the same database, and you want your business plan to differ from that of the last entrepreneur in your field.

You can also find companies that will sell you everything from industry studies to credit reports on individual companies. Market research isn't cheap. It requires significant amounts of expertise, manpower and technology to develop solid research. Large companies routinely spend tens of thousands of dollars researching things they ultimately decide they’re not interested in. Smaller firms can’t afford to do that too often.

For companies of all sizes, the best market research is the research you do on your own. In-house market research might take the form of original telephone interviews with consumers, customized crunching of numbers from published sources or perhaps competitive intelligence you’ve gathered on your rivals through social media. You can gather detailed research on customers, including their likes, dislikes and preferences, through Facebook, and use Google Analytics to sort out the numbers as they pertain to your web visitors. People are researching and making their opinions felt through their actions on the web, so you can gain a lot of marketing insight by looking closely at what is going on electronically.
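The "customized crunching of numbers" mentioned above can be as simple as a few lines of code run over an exported traffic report. The sketch below is a minimal, hypothetical example: the column layout (source, visitors, conversions) and the sample figures are assumptions for illustration, not the format of any real analytics export.

```python
# A minimal sketch of crunching numbers from a web-traffic export.
# The CSV layout (source, visitors, conversions) is hypothetical -- a real
# Google Analytics export will use different column names.
import csv
import io

SAMPLE_EXPORT = """source,visitors,conversions
facebook,1200,36
google,3400,85
newsletter,450,27
"""

def conversion_rates(csv_text):
    """Return {traffic source: conversion rate} from the export."""
    rates = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        rates[row["source"]] = int(row["conversions"]) / int(row["visitors"])
    return rates

rates = conversion_rates(SAMPLE_EXPORT)
# In this sample, the newsletter converts best despite the least traffic.
best = max(rates, key=rates.get)
```

The point of the exercise is the one the excerpt makes: raw visitor counts alone can mislead, and a small amount of your own analysis surfaces which channels actually produce customers.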

You'll also want to do your due diligence within your industry. When looking at comparable businesses (and their data), find a close match. For comparative purposes, consider:

1. Companies of relative size

2. Companies serving the same geographic area, which could be global if you are planning to be a web-based business

3. Companies with a similar ownership structure. If your business has two partners, look for businesses run by a couple of partners rather than an advisory board of 12.

4. Companies that are relatively new. While you can learn from long-standing businesses, they may be successful today because of their 25-year business history and reputation.

You'll want to use the data you've gathered not only to determine how much business you could possibly do but also to figure out how you'll fit into and adapt to the marketplace.

Follow these steps to spend your market research dollars wisely:

1. Determine what you need to know about your market. The more focused the research, the more valuable it will be.

2. Prioritize the results of the first step. You can’t research everything, so concentrate on the information that will give you the best (or quickest) payback.

3. Review less-expensive research alternatives. Small Business Development Centers and the Small Business Administration can help you develop customer surveys. Your trade association will have good secondary research. Be creative.

4. Estimate the cost of performing the research yourself. Keep in mind that with the internet you should not have to spend a ton of money. If you’re considering hiring a consultant or a researcher, remember this is your dream, these are your goals, and this is your business. Don’t pay for what you don’t need.

Source: This article was published at entrepreneur.com

Saturday, 16 September 2017 14:02

Google Reveals Most Popular ‘How To’ Searches

Site owners looking to capitalize on “how to” searches now have an all new source of data at their fingertips.

In addition to highlighting the most popular “how to” searches in a recent blog post, Google has also launched a new site that visualizes this data.

Searches that begin with “how to” are on the rise, growing by 140% over the last 13 years. Among those, searches for help with ‘how to fix’ things are especially popular.

The Google Trends team has even broken down this data by geographic location.

North Americans are most concerned with how to fix their toilets, people in warm climates need the most help with fixing their fridge, and North and Eastern Europeans are most interested in how to fix a light bulb.

Google’s new ‘how to fix’ site lets users select from a drop-down list of countries and see the most popular searches in their area.

The site also breaks down the most popular:

  • Cooking queries
  • Love-related queries
  • Coming-of-age queries
  • Difficult/technical queries
  • Health queries

There’s also brief mention of the top viral queries, with the most viral ‘how to’ query of the moment being “how to make slime.”

Google has compiled all this data to reveal the overall top ‘how to’ searches worldwide:

  1. how to tie a tie
  2. how to kiss
  3. how to get pregnant
  4. how to lose weight
  5. how to draw
  6. how to make money
  7. how to make pancakes
  8. how to write a cover letter
  9. how to make french toast
  10. how to lose belly fat

If you’re hungry for more data, Google has uploaded the top ‘how to’ searches from 2004 to 2017 on its GitHub page.
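For anyone who does pull that data down, a first exploration might look like the sketch below. The inline rows and the (year, query, searches) layout are assumptions for illustration; the actual files on GitHub will have their own schema.

```python
# A minimal sketch for exploring year-by-year "how to" search data.
# The rows and (year, query, searches) layout are illustrative assumptions,
# not the real dataset published by Google.
from collections import defaultdict

ROWS = [
    (2016, "how to tie a tie", 920),
    (2016, "how to make slime", 310),
    (2017, "how to make slime", 1400),
    (2017, "how to tie a tie", 900),
]

def top_query_per_year(rows):
    """Map each year to its highest-volume 'how to' query."""
    by_year = defaultdict(list)
    for year, query, searches in rows:
        by_year[year].append((searches, query))
    return {year: max(pairs)[1] for year, pairs in by_year.items()}

tops = top_query_per_year(ROWS)
# With these sample numbers, "how to make slime" overtakes the perennial
# leader in 2017 -- the kind of viral shift the article describes.
```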

Source: This article was published at searchenginejournal.com by Matt Southern

Tuesday, 05 September 2017 11:39

Who Owns the Internet?

On the night of November 7, 1876, Rutherford B. Hayes’s wife, Lucy, took to her bed with a headache. The returns from the Presidential election were trickling in, and the Hayeses, who had been spending the evening in their parlor, in Columbus, Ohio, were dismayed. Hayes himself remained up until midnight; then he, too, retired, convinced that his Democratic opponent, Samuel J. Tilden, would become the next President.

Hayes had indeed lost the popular vote, by more than two hundred and fifty thousand ballots. And he might have lost the Electoral College as well had it not been for the machinations of journalists working in the shady corners of what’s been called “the Victorian Internet.”

Chief among the plotters was an Ohioan named William Henry Smith. Smith ran the western arm of the Associated Press, and in this way controlled the bulk of the copy that ran in many small-town newspapers. The Western A.P. operated in tight affiliation—some would say collusion—with Western Union, which exercised a near-monopoly over the nation’s telegraph lines. Early in the campaign, Smith decided that he would employ any means necessary to assure a victory for Hayes, who, at the time, was serving a third term as Ohio’s governor. In the run-up to the Republican National Convention, Smith orchestrated the release of damaging information about the Governor’s rivals. Then he had the Western A.P. blare Hayes’s campaign statements and mute Tilden’s. At one point, an unflattering piece about Hayes appeared in the Chicago Times, a Democratic paper. (The piece claimed that Hayes, who had been a general in the Union Army, had accepted money from a soldier to give to the man’s family, but had failed to pass it on when the soldier died.) The A.P. flooded the wires with articles discrediting the story.

Once the votes had been counted, attention shifted to South Carolina, Florida, and Louisiana—states where the results were disputed. Both parties dispatched emissaries to the three states to try to influence the Electoral College outcome. Telegrams sent by Tilden’s representatives were passed on to Smith, courtesy of Western Union. Smith, in turn, shared the contents of these dispatches with the Hayes forces. This proto-hack of the Democrats’ private communications gave the Republicans an obvious edge. Meanwhile, the A.P. sought and distributed legal opinions supporting Hayes. (Outraged Tilden supporters took to calling it the “Hayesociated Press.”) As Democrats watched what they considered to be the theft of the election, they fell into a funk.

“They are full of passion and want to do something desperate but hardly know how to,” one observer noted. Two days before Hayes was inaugurated, on March 5, 1877, the New York Sun appeared with a black border on the front page. “These are days of humiliation, shame and mourning for every patriotic American,” the paper’s editor wrote.

History, Mark Twain is supposed to have said, doesn’t repeat itself, but it does rhyme. Once again, the President of the United States is a Republican who lost the popular vote. Once again, he was abetted by shadowy agents who manipulated the news. And once again Democrats are in a finger-pointing funk.

Journalists, congressional committees, and a special counsel are probing the details of what happened last fall. But two new books contend that the large lines of the problem are already clear. As in the eighteen-seventies, we are in the midst of a technological revolution that has altered the flow of information. Now, as then, just a few companies have taken control, and this concentration of power—which Americans have acquiesced to without ever really intending to, simply by clicking away—is subverting our democracy.

Thirty years ago, almost no one used the Internet for anything. Today, just about everybody uses it for everything. Even as the Web has grown, however, it has narrowed. Google now controls nearly ninety per cent of search advertising, Facebook almost eighty per cent of mobile social traffic, and Amazon about seventy-five per cent of e-book sales. Such dominance, Jonathan Taplin argues, in “Move Fast and Break Things: How Facebook, Google, and Amazon Cornered Culture and Undermined Democracy” (Little, Brown), is essentially monopolistic. In his account, the new monopolies are even more powerful than the old ones, which tended to be limited to a single product or service. Carnegie, Taplin suggests, would have been envious of the reach of Mark Zuckerberg and Jeff Bezos.

Taplin, who until recently directed the Annenberg Innovation Lab, at the University of Southern California, started out as a tour manager. He worked with Judy Collins, Bob Dylan, and the Band, and also with George Harrison, on the Concert for Bangladesh. In “Move Fast and Break Things,” Taplin draws extensively on this experience to illustrate the damage, both deliberate and collateral, that Big Tech is wreaking.

Consider the case of Levon Helm. He was the drummer for the Band, and, though he never got rich off his music, well into middle age he was supported by royalties. In 1999, he was diagnosed with throat cancer. That same year, Napster came along, followed by YouTube, in 2005. Helm’s royalty income, which had run to about a hundred thousand dollars a year, according to Taplin, dropped “to almost nothing.” When Helm died, in 2012, millions of people were still listening to the Band’s music, but hardly any of them were paying for it. (In the years between the founding of Napster and Helm’s death, total consumer spending on recorded music in the United States dropped by roughly seventy per cent.) Friends had to stage a benefit for Helm’s widow so that she could hold on to their house.

Google entered and more or less immediately took over the music business when it acquired YouTube, in 2006, for $1.65 billion in stock. As Taplin notes, just about “every single tune in the world is available on YouTube as a simple audio file (most of them posted by users).” Many of these files are illegal, but to Google this is inconsequential. Under the Digital Millennium Copyright Act, signed into law by President Bill Clinton shortly after Google went live, Internet service providers aren’t liable for copyright infringement as long as they “expeditiously” take down or block access to the material once they’re notified of a problem. Musicians are constantly filing “takedown” notices—in just the first twelve weeks of last year, Google received such notices for more than two hundred million links—but, often, after one link is taken down, the song goes right back up at another one. In the fall of 2011, legislation aimed at curbing online copyright infringement, the Stop Online Piracy Act, was introduced. It had bipartisan support in Congress, and backing from such disparate groups as the National District Attorneys Association, the National League of Cities, the Association of Talent Agencies, and the International Brotherhood of Teamsters. In January, 2012, the bill seemed headed toward passage, when Google decided to flex its market-concentrated muscles. In place of its usual colorful logo, the company posted on its search page a black rectangle along with the message “Tell Congress: Please don’t censor the web!” The resulting traffic overwhelmed congressional Web sites, and support for the bill evaporated. (Senator Marco Rubio, of Florida, who had been one of the bill’s co-sponsors, denounced it on Facebook.)

Google itself doesn’t pirate music; it doesn’t have to. It’s selling the traffic—and, just as significant, the data about the traffic. Like the Koch brothers, Taplin observes, Google is “in the extraction industry.” Its business model is “to extract as much personal data from as many people in the world at the lowest possible price and to resell that data to as many companies as possible at the highest possible price.” And so Google profits from just about everything: cat videos, beheadings, alt-right rants, the Band performing “The Weight” at Woodstock, in 1969.

“I wasn’t always so skeptical,” Franklin Foer announces at the start of “World Without Mind: The Existential Threat of Big Tech” (Penguin Press). Franklin, the eldest of the three famous Foer brothers, is a journalist, and he began his career, in the mid-nineties, working for Slate, which had then just been founded by Microsoft. The experience, Foer writes, was “exhilarating.” Later, he became the editor of The New Republic. The magazine was on the brink of ruin when, in 2012, it was purchased by Chris Hughes, a co-founder of Facebook, whose personal fortune was estimated at half a billion dollars.

Foer saw Hughes as a “savior,” who could provide, in addition to cash, “an insider’s knowledge of social media” and “a millennial imprimatur.” The two men set out to revitalize the magazine, hiring high-priced talent and redesigning the Web site. Foer recounts that he became so consumed with monitoring traffic to the magazine’s site, using a tool called Chartbeat, that he checked it even while standing at the urinal.

The era of good feeling didn’t last. In the fall of 2014, Foer heard that Hughes had hired someone to replace him, and that this shadow editor was “lunching around New York offering jobs at The New Republic.” Before Hughes had a chance to fire him, Foer quit, and most of the magazine’s editorial staff left with him. “World Without Mind” is a reflection on Foer’s experiences and on the larger forces reshaping American arts and letters, or what’s nowadays often called “content.”

“I hope this book doesn’t come across as fueled by anger, but I don’t want to deny my anger either,” he writes. “The tech companies are destroying something precious. . . . They have eroded the integrity of institutions—media, publishing—that supply the intellectual material that provokes thought and guides democracy. Their most precious asset is our most precious asset, our attention, and they have abused it.”

Much of Foer’s anger, like Taplin’s, is directed at piracy. “Once an underground, amateur pastime,” he writes, “the bootlegging of intellectual property” has become “an accepted business practice.” He points to the Huffington Post, since shortened to HuffPost, which rose to prominence largely by aggregating—or, if you prefer, pilfering—content from publications like the Times and the Washington Post. Then there’s Google Books. Google set out to scan every book in creation and make the volumes available online, without bothering to consult the copyright holders. (The project has been hobbled by lawsuits.) Newspapers and magazines (including this one) have tried to disrupt the disrupters by placing articles behind paywalls, but, Foer contends, in the contest against Big Tech publishers can’t win; the lineup is too lopsided. “When newspapers and magazines require subscriptions to access their pieces, Google and Facebook tend to bury them,” he writes. “Articles protected by stringent paywalls almost never have the popularity that algorithms reward with prominence.”

Foer acknowledges that prominence and popularity have always mattered in publishing. In every generation, the primary business of journalism has been to stay in business. In the nineteen-eighties, Dick Stolley, the founding editor of People, developed what might be thought of as an algorithm for the pre-digital age. It was a formula for picking cover images, and it ran as follows: Young is better than old. Pretty is better than ugly. Rich is better than poor. Movies are better than music. Music is better than television. Television is better than sports. And anything is better than politics.

But Stolley’s Law is to Chartbeat what a Boy Scout’s compass is to G.P.S. It is now possible to determine not just which covers sell magazines but which articles are getting the most traction, who’s e-mailing and tweeting them, and how long individual readers are sticking with them before clicking away. This sort of detailed information, combined with the pressure to generate traffic, has resulted in what Foer sees as a golden age of banality. He cites the “memorable yet utterly forgettable example” of Cecil the lion. In 2015, Cecil was shot with an arrow outside Hwange National Park, in Zimbabwe, by a dentist from Minnesota. For whatever reason, the killing went viral and, according to Foer, “every news organization” (including, once again, this one) rushed to get in on the story, “so it could scrape some traffic from it.” He lists with evident scorn the titles of posts from Vox—“Eating Chicken Is Morally Worse Than Killing Cecil the Lion”—and The Atlantic’s Web site: “From Cecil the Lion to Climate Change: A Perfect Storm of Outrage.” (In July, Cecil’s son, Xanda, was shot, prompting another digital outpouring.)

Donald Trump, Foer argues, represents “the culmination” of this trend. In the lead-up to the campaign, Trump’s politics, such as they were, consisted of empty and outrageous claims. Although none deserved to be taken seriously, many had that coveted viral something. Trump’s utterances as a candidate were equally appalling, but on the Internet apparently nobody knows you’re a demagogue. “Trump began as Cecil the Lion, and then ended up president of the United States,” Foer writes.

Both Taplin and Foer begin their books with a discussion of the early days of personal computers, when the Web was still a Pynchonesque fantasy and lots of smart people believed that connecting the world’s PCs would lead to a more peaceful, just, and groovy society. Both cite Stewart Brand, who, after hanging out with Ken Kesey, dropping a lot of acid, and editing “The Whole Earth Catalog,” went on to create one of the first virtual networks, the Whole Earth ’Lectronic Link, otherwise known as the WELL.

In an influential piece that appeared in Rolling Stone in 1972, Brand prophesied that, when computers became widely available, everyone would become a “computer bum” and “more empowered as individuals and co-operators.” This, he further predicted, could enhance “the richness and rigor of spontaneous creation and human interaction.” No longer would it be the editors at the Times and the Washington Post and the producers at CBS News who decided what the public did (or didn’t) learn. No longer would the suits at the entertainment companies determine what the public did (or didn’t) hear.

“The Internet was supposed to be a boon for artists,” Taplin observes. “It was supposed to eliminate the ‘gatekeepers’—the big studios and record companies that decide which movies and music get widespread distribution.” Silicon Valley, Foer writes, was supposed to be a liberating force—“the disruptive agent that shatters the grip of the sclerotic, self-perpetuating mediocrity that constitutes the American elite.”

The Internet revolution has, indeed, sent heads rolling, as legions of bookstore owners, music critics, and cirrhotic editors can attest. But Brand’s dream, Taplin and Foer argue, has not been realized. Google, Amazon, Facebook, and Apple—Europeans refer to the group simply as GAFA—didn’t eliminate the gatekeepers; they took their place. Instead of becoming more egalitarian, the country has become less so: the gap between America’s rich and poor grows ever wider. Meanwhile, politically, the nation has lurched to the right. In Foer’s telling, it would be a lot easier to fix an election these days than it was in 1876, and a lot harder for anyone to know about it. All the Big Tech firms would have to do is tinker with some algorithms. They have become, Foer writes, “the most imposing gatekeepers in human history.”

This is a simple, satisfying narrative, and it allows Taplin and Foer to focus their ire on GAFA gazillionaires, like Zuckerberg and Larry Page. But, as an account of the “unpresidented” world in which we live, it seems to miss the point. Say what you will about Silicon Valley, most of its major players backed Hillary Clinton. This is confirmed by campaign-finance filings and, as it happens, by the Russian hack of Democratic National Committee e-mails. “I hope you are well—thinking of all of you often and following every move!” Facebook’s chief operating officer, Sheryl Sandberg, wrote to Clinton’s campaign chairman, John Podesta, at one point.

It is troubling that Facebook, Google, and Amazon have managed to grab for themselves such a large share of online revenue while relying on content created by others. Quite possibly, it is also anti-competitive. Still, it seems a stretch to blame GAFA for the popularity of listicles or fake news.

Last fall, some Times reporters went looking for the source of a stream of largely fabricated pro-Trump stories that had run on a Web site called Departed. They traced them to a twenty-two-year-old computer-science student in Tbilisi named Beqa Latsabidze. He told the Times that he had begun the election season by pumping out flattering stories about Hillary Clinton, but the site hadn’t generated much interest. When he switched to pro-Trump nonsense, traffic had soared, and so had the site’s revenues. “For me, this is all about income,” Latsabidze said. Perhaps the real problem is not that Brand’s prophecy failed but that it came true. A “computer bum” sitting in Tbilisi is now so “empowered” as an individual that he can help turn an election halfway around the world.

Either out of conviction or simply out of habit, the gatekeepers of yore set a certain tone. They waved through news about state budget deficits and arms-control talks, while impeding the flow of loony conspiracy theories. Now Chartbeat allows everyone to see just how many (or, more to the point, how few) readers there really are for that report on the drought in South Sudan or that article on monopoly power and the Internet. And so it follows that there will be fewer such reports and fewer such articles. The Web is designed to give people what they want, which, for better or worse, is also the function of democracy.

Post-Cecil, post-fact, and mid-Trump, is there anything to be done? Taplin proposes a few fixes. To start, he wants the federal government to treat companies like Google and Facebook as monopolies and regulate them accordingly. (Relying on similar thinking, regulators in the European Union recently slapped Google with a $2.7-billion fine.)

Taplin notes that, in the late nineteen-forties, the U.S. Department of Justice went after A.T. & T., the Google of its day, for violating the Sherman Antitrust Act. The consent decree in the case, signed in 1956, compelled A.T. & T. to license all the patents owned by its research arm, Bell Labs, for a small fee. (One of the technologies affected by the decree was the transistor, which later proved essential to computers.) Google, he argues, could be similarly compelled to license its thousands of patents, including those for search algorithms, cell-phone operating systems, self-driving cars, smart thermostats, advertising exchanges, and virtual-reality platforms.

“It would seem that such a licensing program would be totally in line with Google’s stated ‘Don’t be evil’ corporate philosophy,” Taplin writes. At the same time, he urges musicians and filmmakers to take matters into their own hands by establishing their own distribution networks, along the lines of Magnum Photos, formed by Robert Capa, Henri Cartier-Bresson, and others in 1947.

“What if artists ran a video and audio streaming site as a nonprofit cooperative (perhaps employing the technology in some of those free Google patents)?” he asks at one point. “I have no illusion that the existing business structures of cultural marketing will go away,” he observes at another. “But my hope is that we can build a parallel structure that will benefit all creators.”

Foer prefers the model of artisanal cheesemakers. (“World Without Mind” apparently went to press before Amazon announced its intention to buy Whole Foods.) “The culture industries need to present themselves as the organic alternative, a symbol of status and aspiration,” he writes. “Subscriptions are the route away from the aisles of clickbait.” Just after the election, he notes, the Times added more than a hundred thousand new subscribers by marketing itself as a fake-news antidote. And, as an act of personal resistance, he suggests picking up a book. “If the tech companies hope to absorb the totality of human existence,” he writes, “then reading on paper is one of the few slivers of life that they can’t fully integrate.”

These remedies are all backward-looking. They take as a point of reference a world that has vanished, or is about to. (If Amazon has its way, even artisanal cheese will soon be delivered by drone.) Depending on how you look at things, this is either a strange place for meditations about the future to end up or a predictable one. People who worry about the fate of democracy still write (and read) books. Those who are determining it prefer to tweet. ♦

This article appears in other versions of the August 28, 2017, issue, with the headline “The Content of No Content.”

Source: This article was published at newyorker.com by Elizabeth Kolbert

Microsoft has created a new research lab with a focus on developing general-purpose artificial intelligence technology, the company revealed today. The lab will be located at Microsoft’s Redmond HQ, and will include a team of more than 100 scientists working on AI, from areas including natural language processing, learning and perception systems.

The aim of building a general-purpose AI that can effectively address problems in a range of different areas, rather than focusing on a single specific task, is one that many leading technology companies are pursuing. Most notably, perhaps, Google is attempting to tackle the challenge of more generalized AI via both its own Google Brain project and through efforts at DeepMind, the company it acquired in 2014, which is now its own subsidiary under mutual parent company Alphabet.

Microsoft’s new endeavor is called Microsoft Research AI, and it’ll pull from existing AI expertise at the company, as well as pursue new hires, including experts in related fields such as cognitive psychology to flesh out the team, Bloomberg says. The lab will also formally partner with MIT’s Center for Brains, Minds and Machines. Seeking academic-private tie-ups is not at all unusual in AI development — Microsoft, Google and others, including Uber, have made commitments to academic institutions in order to help secure talent and pipeline for students with related expertise.

In addition to the research lab, Microsoft is going to create an AI ethics oversight panel that will act in an advisory capacity across the company, which is also very much in keeping with industry trends. Microsoft previously signed on to work with DeepMind, Amazon, Google, Facebook and IBM on a cross-company partnership for ethical AI development, and Google and DeepMind also have their own AI ethics board.

Source: This article was published on techcrunch.com by Darrell Etherington

Google is killing the 'Google Now' name but improving the underlying functionality to make it more controllable, engaging -- and searchable.

Google Now was launched at Google I/O in June 2012. It was part of a package of updates and UI changes for mobile search, which included a female-voiced mobile assistant to compete with Apple’s Siri.

Google Now was initially a way to get contextually relevant information based on location, time of day and your calendar. It evolved to become much more sophisticated and elaborate, with a wide array of content categories delivered on cards. For a time it was being called “predictive search,” although that term has faded.

Now was billed as a way to get information on your smartphone without actively searching for it. It was heralded by some as the future of mobile search.

The ‘feed experience’ improves

Today Google is officially killing the “Google Now” brand. It’s not getting rid of the functionality, however. That will remain and is being upgraded with an improved design and some new features, including reciprocal connections between search and your personalized content feed.

Last December, Google introduced a new “feed experience” as part of Google Now, which featured topics in one tab and personal information and updates, such as travel plans and meetings, in a second. In today’s rollout (to the Google app for Android and iOS), that two-tab structure is preserved, but the feed is becoming richer and more controllable. The rollout is US-only, with international markets to follow in the coming weeks.

Users will be able to follow content directly from mobile search results and have that surface on an ongoing basis in their feeds. A new “follow” button will appear in some contexts, as the image above illustrates. However, most content that appears in the feed will still be determined algorithmically, based on search history and engagement with other Google properties such as YouTube.

There will also apparently be some content from locally trending topics. However, that trending content is not based on user contacts or social connections.

At a briefing Tuesday in San Francisco, the Google team, led by Ben Gomes, was asked several times how these changes compare to the Facebook News Feed. The answer: this is about you and your interests, not the topics your friends are engaged with.

Intensity of user interests to be reflected

The specific topics and cards that appear are also being calibrated to reflect the intensity of your interests. If you’re more interested in travel or hip-hop or bike racing than cooking or boxing or art, that will be reflected and emphasized in your feed accordingly. In other words, interest level will be captured.

Google indicated that it will also be easy to unfollow topics: “Just tap on a given card in your feed or visit your Google app settings.” And, of course, as the company’s blog post asserts, “The more you use Google, the better your feed will be.”
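The interest-intensity idea described above can be pictured as a simple ranking function. The sketch below is a toy illustration only, not Google's actual algorithm; the topics, weights and card fields are all invented for the example.

```python
# Toy sketch of interest-weighted feed ranking. This is NOT Google's actual
# algorithm; the topics, weights and card fields below are invented.

def rank_feed(cards, interest_weights):
    """Order feed cards so topics the user engages with most come first."""
    return sorted(
        cards,
        key=lambda card: interest_weights.get(card["topic"], 0.0),
        reverse=True,
    )

# Higher weight = stronger interest; "unfollowing" a topic would drop its key.
interests = {"travel": 0.9, "hip-hop": 0.7, "cooking": 0.2}

cards = [
    {"title": "Easy pasta recipes", "topic": "cooking"},
    {"title": "Cheap flights to Lisbon", "topic": "travel"},
    {"title": "New mixtape reviews", "topic": "hip-hop"},
]

for card in rank_feed(cards, interests):
    print(card["title"])
```

In this toy model, “the more you use Google, the better your feed will be” corresponds to the interest weights being continually re-estimated from engagement.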

Perhaps most interesting, from a “search” perspective, is that every card will have a header that will be able to initiate a mobile search with a tap. That wasn’t possible with Google Now. Thus there’s a feedback loop of sorts: search results can be followed, feed content can be searched.

It’s very much in Google’s interest to build products that keep the brand and some version of search in front of mobile users throughout the day. But Google is also trying to improve upon Now as a product, even as it gets rid of that name.

‘Vast majority’ of queries now mobile

Gomes said during the briefing that the “vast majority of our queries come from mobile.” Obviously, Google has very successfully transitioned to mobile, which wasn’t a foregone conclusion. Now it wants to give users more reasons to check in daily and new pathways into search. It’s not clear how widely Now was being used by the bulk of Google’s mobile audience.

Beyond the mobile app experience, Google said that it would be bringing the feed to the desktop version of Chrome in the near future, though it didn’t show that off. I’m imagining it as the reincarnation of iGoogle, a personalized start page that was shuttered in 2012, the same year Now was introduced.

Source: This article was published on searchengineland.com by Greg Sterling

Google’s Uptime, an experimental app that enables people to watch YouTube videos with friends, is now available to everyone who has access to the US iOS App Store.

Uptime initially launched earlier this year and was created by Google’s internal incubator, Area 120. Google’s Area 120 program encourages Google employees to spend 20% of their time working on projects that are not directly related to their job. Uptime is one of many projects to have been launched through the Area 120 program.

When Uptime initially launched, it required an invite code; now anyone is free to download it. Users can connect with their Facebook account to find friends who are already on the app, and connections can also be made by following others within the app.

People can use Uptime to watch YouTube videos with friends in real-time, or they can be viewed at a later time while still being able to see friends’ reactions to the video. Reactions consist of various emoji that can be tapped on while watching a video, similar to other live video streaming services.

Since the launch of Uptime earlier this year, others have been trying to imitate the idea with apps like Cabana, Let’s Watch It, Fam, and so on. The number of competing apps to enter the marketplace may have spurred the decision to launch Uptime more widely.

Despite Area 120 apps technically falling under the Google umbrella, they are not branded by Google in the App Store nor do they receive much promotion from the company. It will be interesting to see if that changes in light of competing apps gaining traction as of late.

Uptime can be downloaded from the US iOS App Store here.

Source: This article was published on searchenginejournal.com by Matt Southern

Our current understanding of the Universe states that it's governed by four fundamental forces: gravity, electromagnetic, and the strong and weak nuclear forces.

But there are hints of a fifth force of nature, and if it exists, we'd not only be able to fill the remaining holes in Einstein's general relativity — we'd have to rethink our understanding of how the Universe actually works. And now physicists have figured out how to put this mysterious force to the ultimate test.

The four forces of nature are what hold the standard model of physics together, which is what we use to explain and predict the behaviour of particles and matter in our Universe.

At the smallest end of the scale are the two nuclear forces — the strong nuclear force is what holds atomic nuclei in place, and the weak nuclear force enables certain atoms to undergo radioactive decay.

Gravity and the electromagnetic force are on the larger end of the scale — electromagnetic force is needed to keep our molecules together, while gravity is responsible for ensuring that entire galaxies and planets aren't ripped apart.

It's all very neat and sensible, but there's a problem — in a lot of ways, gravity is the 'odd one out' in this very important group.

For one thing, gravity is the only one of the four fundamental forces that humans haven't figured out how to produce and control.

It also doesn't appear to explain everything that it should — studies have shown that there's more gravity in our Universe than can be produced by all the visible matter out there.

The entity that we use to explain this gap — a placeholder called dark matter — hasn't exactly helped its case, because even our best technology can't find a trace of it.

[Image: There is a supermassive black hole at the center of our galaxy. Credit: AP]

Thanks to our inability to figure out what dark matter actually is, some physicists (very controversially) want to ditch gravity as a fundamental force altogether.

But instead of permanently dropping one of the fundamental forces of nature in the hopes that the Universe will make more sense without it, what if we added a fifth force that ties gravity to the others in ways we've never thought of before?

"Einstein's theory describes [gravity] beautifully well, but there's lots of evidence showing the theory has holes," says Andrea Ghez, director of the University of California, Los Angeles Galactic Centre Group.

"The mere existence of supermassive black holes tells us that our current theories of how the Universe works are inadequate to explain what a black hole is."

Ghez and her team are on the hunt for this hypothetical fifth force of nature, and say the best place to look would be somewhere in the Universe where the influence of gravity is so strong, signs of something extra will be easier to detect.

By analysing extremely sharp images of the heart of the Milky Way taken by the Keck Observatory in Hawaii, the researchers can track the orbits of stars near our galaxy's supermassive black hole.

Based on these paths, they can measure the direct influence of gravity on the stars' movements, and figure out if something else is at play.

"This is really exciting. [O]ur work on studying stars at the centre of our galaxy is opening up a new method of looking at how gravity works," says Ghez.

"By watching the stars move over 20 years using very precise measurements taken from Keck Observatory data, you can see and put constraints on how gravity works."

The team is particularly interested in an event that's expected to take place next year, when a star called S0-2 will draw closer than ever to our galaxy's supermassive black hole, and be pulled in at maximum gravitational strength.

If there are any deviations from what general relativity predicts, this will be the best time to spot them.

"If gravitation is driven by something other than Einstein's theory of general relativity, you'll see small variations in the orbital paths of the stars," says Ghez.

This isn't the first time that physicists have actively hunted for the fifth force of nature — last year, a separate team detected signs of its influence in the energy signature of what appeared to be a new subatomic particle.

"If true, it's revolutionary," lead researcher Jonathan Feng from the University of California, Irvine, said at the time.

"If confirmed by further experiments, this discovery of a possible fifth force would completely change our understanding of the Universe, with consequences for the unification of forces and dark matter."

We're still a long way off figuring out if this force actually exists, but this new technique will be the first time scientists have ever looked for it in a gravitational field as strong as the one created by a supermassive black hole.

And even if we don't end up finding another force of nature at the heart of our galaxy, we'll likely gain a better understanding of gravity itself — something the standard model of physics desperately needs.

"It's exciting that we can do this because we can ask a very fundamental question — how does gravity work?" says Ghez.

Their research has been published in Physical Review Letters.

The internet has changed the way we discover and consume information. Think about the year 2000: you put a keyword in the search bar, and the websites with the highest keyword concentration appeared on top. Things gradually changed with the Panda, Penguin, Pigeon and other updates, which shifted the focus to quality of content. The search industry was further revolutionized with the introduction of Google Instant around 2010. People were excited to see the search engine offer relevant results after reading just the first few letters of a keyword.

Fast forward to 2017, and search engines have become even smarter. Their focus is on offering the most relevant and useful information based on user preferences. Enter content discovery! Marketers are now keen on making brand content discoverable to ensure better awareness and traffic.

But what is all this buzz about content discovery? Let us take a look.

What is Content Discovery?

Content discovery is the art and science of using predictive algorithms to help make content recommendations based on how people search. Search engines and various other platforms are now using artificial intelligence (AI) to understand customer preferences and interests. This helps users to find content that’s most suitable for them.  
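One common ingredient of such recommendation systems is content-based filtering: represent each user and each item as a vector of topic affinities, and surface the items most similar to the user's profile. The sketch below is a minimal illustration; the topic names and scores are invented, and real discovery platforms blend many more signals than this.

```python
import math

# Minimal content-based recommendation sketch. Topic names and scores are
# invented; real discovery systems blend many more signals than this.

def cosine(a, b):
    """Cosine similarity between two sparse topic-affinity vectors (dicts)."""
    dot = sum(a[k] * b.get(k, 0.0) for k in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def recommend(user_profile, items, top_n=2):
    """Return the top_n item names most similar to the user's interests."""
    scored = [(cosine(user_profile, vec), name) for name, vec in items.items()]
    return [name for _, name in sorted(scored, reverse=True)[:top_n]]

user = {"finance": 0.8, "tech": 0.5}
items = {
    "Fed rate preview": {"finance": 0.9, "politics": 0.2},
    "New phone review": {"tech": 0.9},
    "Celebrity gossip": {"entertainment": 1.0},
}

print(recommend(user, items))
```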

To understand what content discovery is all about, let us review some examples. Social media sites such as Facebook have content discovery features integrated into their algorithms. Consider the platform's News Feed: the content that appears in an individual feed is selected according to each user's past behavior and personal preferences. In fact, a survey carried out by Forrester Consulting found social media to be the most preferred source of discovery for news and information among online adults between the ages of 18 and 55. The survey also revealed that a young millennial follows an average of 121 publishers on social media.

Similar to Facebook, YouTube’s “Recommended for You” section is another example of how user activity and preferences fuel content discovery. 

Why is Content Discovery Important?

Content discovery has become more important than ever. This is because the amount of online content is increasing exponentially. Almost every brand is creating content to offer value for their audiences, which means it’s even more difficult for people to find the information they are looking for. Content discovery allows people to find information that is highly relevant and personalized. In fact, content discovery helps both consumers and online marketers. Here is how:

  • Consumers find desired data/information quickly without having to scour through hundreds and thousands of search results.
  • Online marketers can put relevant content in front of their targeted audience at the right time through the right channels.

Content discovery helps people weed out irrelevant and unimportant information. The next question is: how can brands, publishers and advertisers benefit from content discovery?

How Content Discovery Helps Brands, Publishers and Advertisers

Marketers spend time creating high-quality content and sharing it across various channels, but they often fail to get the attention they seek. Why? Most likely, the content gets lost in the deluge. So what can marketers do to ensure their content gets seen? Here is what leaders are doing to get attention and expand their reach.

Brands, publishers and advertisers are leveraging content discovery platforms such as Taboola, Outbrain, Curiyo, Renoun and others to expand their reach and improve ROI. These tools allow marketers to offer high-level engagement (through high-quality, relevant content), which gives them ample opportunities to monetize and capitalize on user engagement.

The platforms analyze user behaviors by considering a number of metrics such as time spent on a specific site, the path taken to reach a specific content source, search habits, preferences and others. These useful insights can be used to target content and advertisement campaigns and ensure better lead generation.
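As a rough sketch of how such behavioral metrics might be combined, consider a weighted engagement score. The metric names and weights below are assumptions for illustration, not any platform's actual formula.

```python
# Rough sketch of a weighted engagement score; the metric names and weights
# are assumptions for illustration, not any platform's actual formula.

WEIGHTS = {
    "seconds_on_page": 0.005,  # time spent on the content
    "pages_per_visit": 0.3,    # depth of the path taken through the site
    "searches": 0.2,           # related search activity
}

def engagement_score(session):
    """Combine behavioral signals into one comparable score per session."""
    return sum(weight * session.get(metric, 0) for metric, weight in WEIGHTS.items())

sessions = [
    {"user": "a", "seconds_on_page": 240, "pages_per_visit": 4, "searches": 2},
    {"user": "b", "seconds_on_page": 30, "pages_per_visit": 1},
]

for s in sessions:
    print(s["user"], round(engagement_score(s), 2))
```

Scores like this can then feed targeting decisions: sessions scoring highly on a topic are better candidates for related content and campaigns.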

Content discovery has become even more important for brands as consumers increasingly prefer information on their mobile devices rather than on their PCs and laptops. According to Statista, the number of smartphone users across the world increased from 1.5 billion in 2014 to 2.17 billion in 2016, and is expected to rise to 2.87 billion by 2020. Moreover, about 20 percent of millennials no longer use a desktop to access the Web.


With a number of content discovery platforms available, marketers, publishers and advertisers need not worry about a user leaving their site or blog to find further information on a preferred topic.

Here are some quick tips that will help marketers present the most relevant content in front of their targeted audience:

1. Offer Users Quality Content

Marketers must focus on providing their audience with a lot of high-quality content instead of focusing on creating just a one-off piece. Different types of content appeal to different audiences, so you have to make sure you cater to a larger audience. There have been instances when a single piece of content has gotten a huge amount of attention and helped the brand to grow immensely, but that is only momentary. By creating quality content you can reap benefits for a longer period of time.

2. Focus on Multi-Channel Strategies

Facebook can help you get a lot of attention, but the audience it caters to is not multidimensional. To expand your reach, you need to think about leveraging other channels as well. So, by designing a comprehensive content marketing strategy that includes search marketing, Facebook marketing, Twitter marketing, Instagram marketing, etc., you can benefit a lot. Once you get started you can measure how each channel is contributing to the campaign and tweak your strategies accordingly.

3. Help Others and They Will Help You

Marketers often make the mistake of promoting only their own content. By sharing high-quality content from other sources or publishers, you can offer variety to your audience, and they will be more likely to come to your site when they need information. As the saying goes, “you get what you give”: other publishers may be willing to share your content or mention it on their blogs and websites, which will help increase your reach.

4. Create Content That Your Audience Will Love

Who does not love instant success? But when it comes to content marketing you can never expect overnight success. The secret is to create content that your audience will love for years to come. With a long-term content discovery strategy, you can ensure more benefits. Once your audience is engaged you can move over to measuring the performance of your content and make adjustments accordingly to ensure better results.

5. Strategic Organization Aids Content Discovery

Content silos are harmful to your overall content marketing because they create dead ends in your engagement path. Silos form when you group content by date or type (blog posts, videos, etc.). Instead, organize your content by topic, which makes content discovery easier.

While you can make your content discoverable by using search bars, internal links and a content recommendation engine, strategic organization helps users find the most relevant content quickly and easily. Moreover, organizing content by topic, persona or account helps you measure performance and engagement using project management tools such as Trello, Workzone, Basecamp and others. This will help you identify which content performs best, so you can leverage it further. For instance, if a blog post does extremely well, you can then make a video, slideshow or infographic to build on that momentum.
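The "organize by topic, not by type or date" advice above can be illustrated with a few lines of grouping code; the titles, types and topics below are invented for the example.

```python
from collections import defaultdict

# Group a flat content inventory by topic instead of by type or date, so
# related pieces can cross-link rather than dead-end in format silos.
# Titles, types and topics are invented for illustration.

posts = [
    {"title": "SEO basics", "type": "blog", "topic": "seo"},
    {"title": "SEO checklist walkthrough", "type": "video", "topic": "seo"},
    {"title": "Email drip campaign guide", "type": "blog", "topic": "email"},
]

by_topic = defaultdict(list)
for post in posts:
    by_topic[post["topic"]].append(post["title"])

for topic, titles in sorted(by_topic.items()):
    print(topic, titles)
```

A topic index like this is also the natural input for "related content" widgets and for per-topic performance reports.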

6. Offer Variety

Do you focus on creating textual content only? Think again. In the modern day, people consume information on the go, so they might not have the luxury to read through all the extensive articles. By offering a mix of content in your feed you can help your audience consume information in the form they prefer the most. This means you need to repurpose your content and convert it into videos, slideshows, podcasts, infographics and other forms. This will help you leverage a variety of channels and reach out to a larger audience.

Create high-quality content and use various content discovery tools to reach out to your targeted audience and maximize your ROI.

Future of Content Discovery

Attention spans are becoming shorter, so people want quick access to relevant information. Therefore, brands need to focus on delivering personalized and focused content to ensure better engagement. By gearing towards this new search model and leveraging the content distribution platforms, brands can fulfill a user’s desire to access quality content within moments.

However, this is just the start for content discovery. Content discovery is still evolving and improving so it is still not ready to replace the generic search completely. A huge number of people still prefer the traditional way to discover their desired content. But with changing user preferences, traditional search will soon become outdated and will be replaced by the modern techniques of content discovery. Therefore, marketers must be prepared to adapt to the changes and satisfy the needs of the customer.

Conclusion

User preferences are changing quickly. Users now want to access relevant and useful information as fast as possible. This is not possible with the traditional search model, since users need to scour numerous sources to find relevant information. Thus the rise of content discovery! It helps users find content in the nick of time.

As content discovery becomes even more popular, more platforms need to be explored and tested to evaluate how they can further benefit the businesses. Online marketers should start integrating content discovery into their content marketing strategies to make sure they can retain their customers and keep satisfying them for a longer period of time.

Author Bio

Pratik Dholakiya is the Co-Founder of E2M, a full service digital marketing agency and PRmention, a digital PR agency. He regularly speaks at various conferences about SEO, Content Marketing, Growth Hacking, Entrepreneurship and Digital PR. Pratik has spoken at NextBigWhat’s UnPluggd, IIT-Bombay, SMX Israel, and other major events across Asia. As a passionate marketer, he shares his thoughts and knowledge on publications like Search Engine Land, Entrepreneur Magazine, Fast Company, The Next Web and the Huffington Post to name a few. He has been named one of the top content marketing influencers by Onalytica three years in a row.

Source: This article was published on military-technologies.net

Research shows people who use the internet more often have more practice identifying fake news

In the early years of the internet, it was revolutionary to have a world of information just a click away from anyone, anywhere, anytime. Many hoped this inherently democratic technology could lead to better-informed citizens more easily participating in debate, elections and public discourse.

Today, though, many observers are concerned that search algorithms and social media are undermining the quality of online information people see. They worry that bad information may be weakening democracy in the digital age.

The problems include online services conveying fake news, splitting users into “filter bubbles” of like-minded people and enabling users to unwittingly lock themselves up in virtual echo chambers that reinforce their own biases.

These concerns are much discussed, but they have not yet been thoroughly studied. What research does exist has typically been limited to a single platform, such as Twitter or Facebook. Our study of search and politics in seven nations (the United States, Britain, France, Germany, Italy, Poland and Spain, surveyed in January 2017) found these concerns to be overstated, if not wrong. In fact, many internet users trust search to help them find the best information, check other sources and discover new information in ways that can burst filter bubbles and open up echo chambers.

Surveying internet users

We sought to learn directly from people about how they used search engines, social media and other sources of information about politics. Through funding from Google, we conducted an online survey of more than 14,000 internet users in seven nations.

We found that the fears surrounding search algorithms and social media are not irrelevant — there are problems for some users some of the time. However, they are exaggerated, creating unwarranted fears that could lead to inappropriate responses by users, regulators and policymakers.

The importance of searching

The survey findings demonstrate the importance of search results over other ways to get information. When people are looking for information, they very often search the internet. Nearly two-thirds of users across our seven nations said they use a search engine to look for news online at least once a day. They view search results as equally accurate and reliable as other key sources, like television news.

[Chart: How often internet users say they use a search engine to look for news online. Weekly: 10%; Daily: 22%; More than once daily: 64%. The Conversation, CC BY-ND]

In line with that general finding, a search engine is the first place internet users go online for information about politics. Moreover, those internet users who are very interested in politics, and who participate in political activities online, are the most likely to use a search engine like Bing or Google to find information online about politics.

But crucially, those same users engaged in search are also very likely to get information about politics on other media, exposing themselves to diverse sources of information, which makes them more likely to encounter diverse viewpoints. Further, we found that people who are interested and involved in politics online are more likely to double-check questionable information they find on the internet and social media, including by searching online for additional sources in ways that will pop filter bubbles and break out of echo chambers.

Internet-savvy or not?

It’s not just politically interested people who have these helpful search habits: People who use the internet more often and have more practice searching online do so as well.

That leaves the least politically interested people and the least skilled internet users as most susceptible to fake news, filter bubbles and echo chambers online. These individuals could benefit from support and training in digital literacy.

However, for most people, internet searches are critical for checking the reliability and validity of information they come across, whether online, on social media, on traditional media or in everyday conversation. Our research shows that these internet users find search engines useful for checking facts, discovering new information, understanding others’ views on issues, exploring their own views and deciding how to vote.

International variations

We found that people in different countries do vary in how much they trust and rely on the internet and searches for information. For example, internet users in Germany, and to a lesser extent those in France and the United Kingdom, are more trusting in TV and radio news, and more skeptical of searches and online information. Internet users in Germany rate the reliability of search engines lower than those in all the other nations, with 44 percent saying search engines are reliable, compared with 50 to 57 percent across the other six countries.

[Chart: Share of internet users who say search engines are reliable. Germany: 44.4%; France: 49.7%; UK: 50.9%; Overall: 52.4%; US: 54.2%; Spain: 54.9%; Italy: 55.8%; Poland: 56.6%. The Conversation-US, CC BY-ND]

In Poland, Italy and Spain, people trust traditional broadcast media less and are more reliant on, and trusting of, internet and searching. Americans are in the middle; there were greater differences within European countries than between Europe as a whole and the U.S. American internet users were so much more likely to consult multiple sources of information that we called them “media omnivores.”

Internet users generally rely on a diverse array of sources for political information. And they display a healthy skepticism, leading them to question information and check facts. Regulating the internet, as some have proposed, could undermine existing trust and introduce new questions about accuracy and bias in search results.

But panic over fake news, echo chambers and filter bubbles is exaggerated, and not supported by the evidence from users across seven countries.

William H. Dutton, Professor of Media and Information Policy, Michigan State University

This article was originally published on The Conversation.


AIRS is the world's leading community for Internet Research Specialists, providing a unified platform that delivers education, training and certification for online research.
