Clara Johnson

Tuesday, 14 June 2016 10:39

6 Benefits of Facebook Instant Articles

In case you’re feeling out of the loop when it comes to all the fuss surrounding Instant Articles in recent weeks, let’s get you caught up. At its essence, Instant Articles is a distribution platform that allows publishers to distribute their content as native media within Facebook’s mobile app.

Last spring, Facebook launched a pilot of the platform with a select few premium partner publishers, which included The New York Times, BuzzFeed and The Atlantic. And as of last month, the platform is available to everyone. Instant Articles allow you to share content just like you would on your own website – but with interactive elements designed to enhance the mobile viewing experience. The content is accessible to Facebook users on Android and iPhone.

Best of Times or Worst of Times?


Facebook is adamant that Instant Articles are better for publishers because they’re better for end users: they take far less time to display on audience members’ screens – up to 10 times faster than your own hosted mobile website. Instant Articles are HTML5 documents optimized for quick rendering using Facebook’s technology.

Publishers, on the other hand, are unsure. Many are wary of the platform, and understandably so. There are fears over the pitfalls associated with “digital sharecropping,” i.e. increasing your business’s dependency on someone else’s property. Content professionals have been burned in the past by changes in Facebook’s algorithm that caused organic reach and Facebook referral traffic to plummet, so there’s little justification to view Zuck and company as a strategic partner.

There’s also plenty of uncertainty across the media industry surrounding the viability of monetization models in the age of desktop browser ad blocking and mobile advertising’s transparency woes.

Depending on what you’re aiming to accomplish with your content, Instant Articles may be a viable solution for you. If you’re in the publishing business, your content – not your website – is your product, so it’s easy to argue that distributing your product and resonating with relevant people is your primary goal, regardless of who owns the platform you use.

Trying the Platform on for Size


Perhaps it’s time to let go of your digital sharecropping stigmas and start viewing your content as a fully detachable asset. Perhaps you don’t need to focus on driving traffic to a fully owned media property. The Instant Articles platform is relatively easy to experiment with, too, so you don’t need to fully commit to try it out.

A WordPress plugin is available to help streamline uploads and improve workflow efficiency. And if you’re using Drupal, there’s an integrated solution available for you too. Once you take care of basic setup as a one-time requirement, it’s possible to automate Instant Article production directly from your content management system. Facebook’s own tools also allow publishers to preview Instant Articles-powered content, to track the performance of all the items in your publishing feed and to edit content manually.
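
For a sense of what that setup involves, here is a rough, simplified sketch of the Instant Articles markup: it is essentially a standard HTML5 document with a few Facebook-specific properties. The URL, headline, dates and publisher name below are placeholders; check Facebook’s Instant Articles documentation for the authoritative format.

```html
<!doctype html>
<html lang="en" prefix="op: http://media.facebook.com/op#">
  <head>
    <meta charset="utf-8">
    <!-- Link back to the web version of this article -->
    <link rel="canonical" href="https://www.example.com/my-article.html">
    <meta property="op:markup_version" content="v1.0">
  </head>
  <body>
    <article>
      <header>
        <h1>My Article Headline</h1>
        <time class="op-published" datetime="2016-06-14T10:39:00Z">June 14th, 10:39 AM</time>
        <time class="op-modified" datetime="2016-06-14T10:39:00Z">June 14th, 10:39 AM</time>
      </header>
      <p>Article body copy goes here, as ordinary HTML paragraphs.</p>
      <footer>
        <small>© 2016 Example Publisher</small>
      </footer>
    </article>
  </body>
</html>
```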

So despite all the arguments to avoid Instant Articles like the plague, there are some compelling reasons to give it a shot and see how well it serves your goals as a content producer. Here are six of the most enticing benefits that Facebook’s Instant Articles platform has to offer.

1. Faster Load Times Mean Better Audience Experiences


Loading time is critical to user experience, with data indicating that even a single second delay can cause up to a 7% loss in conversions. When the content we want doesn’t appear on our small screens within seconds, we lose patience and bounce.

Since Instant Articles are optimized and load within the app rather than on the standard mobile web, they render up to ten times faster, making it easier for readers to view, enjoy and share. Views and shares are critical to any social media-based content distribution efforts, after all, so technology that helps improve those metrics is useful. Facebook reports that Instant Articles have 70% lower bounce rates and 30% higher share rates than standard mobile web articles.

2. You Can Monetize With the Platform


Facebook allows publishers to sell and serve their own display and rich media ads, while keeping 100% of the revenue. Publishers also have the option to display ads from the Facebook Audience Network to help monetize any unsold inventory. With this option, the publisher keeps 70% of the revenue and gives Facebook a 30% cut.

screen_monetization

Facebook’s overall growth is fueled by mobile ads, which account for some 78% of the company’s ad revenue and 74% of its total revenue. So there’s a solid chance they’ll do a good job of profiting from the content you distribute on Instant Articles, and you’ll be in a position to make at least twice what they do from each ad tap.
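
To put that revenue split in concrete terms, here is a quick back-of-the-envelope calculation (the dollar figure is hypothetical):

```python
# Hypothetical month with $1,000 of Audience Network revenue,
# split 70/30 as described above.
audience_network_revenue = 1000.00

publisher_share = audience_network_revenue * 0.70  # $700.00 to the publisher
facebook_share = audience_network_revenue * 0.30   # $300.00 to Facebook

print(publisher_share / facebook_share)  # ~2.33, i.e. "at least twice" Facebook's take
```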

3. A Proven Content Discovery Network


People are already using Facebook more than any other channel to find content to consume, so there’s no doubt that the platform can deliver audience members – if Facebook is motivated to do so.

There are even signs that Instant Articles can get people interested in exploring the content on your site better than organic Facebook Page posts can. Facebook’s data shows the Instant Articles format translates to 20% higher click-through rates on links inside content than other mobile publishing formats.

4. Control Over Branding


Publishers have decent if not complete control of content experience branding options with Instant Articles. You can adjust colors and fonts and even include logos and other embedded media. So even though the publishing platform doesn’t have the branding versatility of your own property, the control you have here is far greater than what you have on LinkedIn Pulse or even Medium, for example.

This creates a more powerfully branded experience for your audience, which is important for memorability. When you consider that most people are better at remembering what they see than what they hear, your brand’s visual appearance is extremely important. However, memory improves when we both see and hear something, so the additional audio and visual features help to drive memorability home.

5. Integrations With Measurement Tools


With Instant Articles, publishers can still measure page views through a variety of analytics tools, including comScore, Chartbeat, Google Analytics, Omniture, and Adobe Analytics. If you integrate your reporting data properly, you’ll be able to track content paths to conversion from a single dashboard, regardless of if the content in question is on Instant Articles or your own website. You can also use a third-party tool to create a custom dashboard for tracking Instant Articles performance alongside metrics from any number of other platforms.
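
For reference, third-party analytics are typically embedded in the Instant Article markup itself via a tracker element. A simplified sketch (the tracking snippet is a placeholder for your usual Google Analytics, comScore or Chartbeat code):

```html
<figure class="op-tracker">
  <iframe>
    <!-- your analytics tracking snippet goes here -->
  </iframe>
</figure>
```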

Tools like ShareThis make it possible to split test Instant Articles headlines. This way, you can still keep an eye on which types of content are resonating most with your audience, to continually drive traffic and draw in additional readers.

6. Immersive Full-Screen Experience


Mobile-only usage is increasing, and 30% of Facebook’s active users access the network solely using their phones. With the full-screen content experience of Instant Articles, publishers have more control over what mobile users see and do. Here Facebook seems to be taking a cue from Snapchat, where users favor fast consumption and full-screen video posts command undivided attention.

screen_tilt_to_explore_photo

screen_maps

The richer content experience of Instant Articles, moreover, aims to better serve readers with a variety of features, such as auto-play video, embedded audio captions, tilt-to-pan photos and interactive maps.

The Game Has Changed


Facebook’s Instant Articles platform definitely shows promise. Time will tell how publishers are affected by this rollout, and there are plenty of reasons to be wary.

On the other hand, there are some compelling reasons to consider trying out Instant Articles as a mobile-optimized, socially integrated solution for content distribution and monetization. And it’s easy enough to automate some experiments with the platform with minimal onboarding friction, so if you consider yourself to be a relatively brave and cunning content marketer, why not give it a shot and compare performance with your onsite assets?

Source:  https://www.searchenginejournal.com/what-are-the-benefits-of-facebook-instant-articles/164225/?ver=164225X2

The number of devices connected to the Internet is continuously growing – including household appliances. They open up numerous new attack targets.

IT security experts from Bochum, headed by Prof Dr Thorsten Holz, are developing a new method for detecting and fixing vulnerabilities in the applications run on different devices – regardless of the processor integrated in the respective device.

In future, many everyday items will be connected to the Internet and, consequently, become targets of attackers. As all devices run different types of software, supplying protection mechanisms that work for all poses a significant challenge.

This is the objective pursued by the Bochum-based project "Leveraging Binary Analysis to Secure the Internet of Things," Bastion for short, funded by the European Research Council.

A shared language for all processors

Because the software running on a device more often than not remains the manufacturer's corporate secret, researchers at the Chair for System Security at Ruhr-Universität Bochum do not analyse the original source code, but rather the binary code of zeros and ones that they can read directly from a device.

However, different devices are equipped with processors with different complexities: while an Intel processor in a computer understands more than 500 commands, a microcontroller in an electronic key is able to process merely 20 commands. An additional problem is that one and the same instruction, for example "add two numbers," is represented as different sequences of zeros and ones in the binary language of two processor types. This renders an automated analysis of many different devices difficult.

In order to perform processor-independent security analyses, Thorsten Holz's team translates the different binary languages into a so-called intermediate language. The researchers have already successfully implemented this approach for three processor architectures: Intel, ARM and MIPS.
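
To illustrate the idea (this is a purely invented sketch, not the Bastion tooling): an "add" instruction written differently for two architectures can be lifted into one shared representation, so that later analyses only need to understand a single language.

```python
# Purely illustrative sketch: lifting architecture-specific "add" instructions
# into one shared intermediate representation (IR). The syntax and IR below
# are invented for illustration; real binary lifters are far more involved.
from dataclasses import dataclass

@dataclass
class IRAdd:
    dst: str   # destination register
    src1: str  # first operand
    src2: str  # second operand

def lift_x86(instruction: str) -> IRAdd:
    # x86 style: "add eax, ebx" means eax = eax + ebx
    op, operands = instruction.split(maxsplit=1)
    dst, src = [part.strip() for part in operands.split(",")]
    assert op == "add"
    return IRAdd(dst=dst, src1=dst, src2=src)

def lift_arm(instruction: str) -> IRAdd:
    # ARM style: "ADD R0, R1, R2" means R0 = R1 + R2
    op, operands = instruction.split(maxsplit=1)
    dst, src1, src2 = [part.strip() for part in operands.split(",")]
    assert op.upper() == "ADD"
    return IRAdd(dst=dst, src1=src1, src2=src2)

print(lift_x86("add eax, ebx"))    # IRAdd(dst='eax', src1='eax', src2='ebx')
print(lift_arm("ADD R0, R1, R2"))  # IRAdd(dst='R0', src1='R1', src2='R2')
```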

Closing security gaps automatically

The researchers then look for security-critical programming errors on the intermediate language level. They intend to automatically close the gaps thus detected. This does not yet work for every piece of software. However, the team has already demonstrated that the method is sound in principle: in 2015, the IT experts identified a security gap in Internet Explorer and succeeded in closing it automatically.

The method is expected to be completely processor-independent by the time the project is wrapped up in 2020. Integrating protection mechanisms is supposed to work for many different devices, too.

Helping faster than the manufacturers

"Sometimes, it can take a while until security gaps in a device are noticed and fixed by the manufacturers," says Thorsten Holz. This is where the methods developed by his group can help. They protect users from attacks even if security gaps have not yet been officially closed.

Source:  https://www.sciencedaily.com/releases/2016/06/160609064300.htm

If you searched for "three white teenagers" on Google Images earlier this month, the result spat up shiny, happy people in droves – an R.E.M. ballad in JPG format. The images, mostly stock photos, displayed young Caucasian men and women laughing, holding sports equipment or caught whimsically mid-selfie.

If you searched for "three black teenagers", the algorithm offered an array of mugshots.

A soon-to-graduate senior at Clover Hill High School in Midlothian, Virginia in the US, named Kabir Alli, recorded the disparity – and, as any enterprising 18-year-old would, posted the video to Twitter. The result was swift and massive virality, as his video was shared more than 65,000 times. (Similar observations had been made before, by YouTube videographers and others, but had not quite so deeply lodged in the internet's consciousness.)

Before he made the video, friends told Alli about what the Google search would pull up. But the teenager says watching it happen in person was still a surprise. "I didn't think it would actually be true," Alli said in an interview with USA Today. "When I saw the results I was nothing short of shocked."

Google responded that its search algorithm mirrors the availability and frequency of online content. "This means that sometimes unpleasant portrayals of sensitive subject matter online can affect what image search results appear for a given query," the company said in a statement to the Huffington Post UK. "These results don't reflect Google's own opinions or beliefs – as a company, we strongly value a diversity of perspectives, ideas and cultures."

Algorithms like the ones that power Google's search engine tend to be esoteric or trade secrets or both, giving the software an air of mystery. Considering algorithms are, after all, lines of code, it is understandable why we might want to perceive them as unerring, impartial decision makers. That is a tempting view – but it is also, experts say, incorrect.

As David Oppenheimer, a University of California, Berkeley law professor said about algorithms to the New York Times in 2015: "Even if they are not designed with the intent of discriminating against those groups, if they reproduce social preferences even in a completely rational way, they also reproduce those forms of discrimination."

Google Image searches for "beautiful dreadlocks" yield mostly dreadlocked white people, as Buzzfeed UK pointed out in April, with some critics citing this Eurocentric bent as an example of racism. Alli, for his part, told USA Today that he does not believe Google is racist, saying that "black males making poor choices also plays a major role". But others disagree that Google should be so readily absolved. Safiya Umoja Noble, an African American studies professor at the University of California, Los Angeles argued to USA Today that Google has a responsibility to eliminate racial bias from its algorithm.

"It consistently issues a statement that it's not responsible for the output of its algorithm," Noble said. "And yet we have to ask ourselves: If Google is not responsible for its algorithm, who is?"

Even if human programmers do not reflect widespread discrimination, intentionally or not, they can introduce bias through omission. Google also came under fire in July 2015 when its photo app autonomously labelled a pair of black friends as animals. As The Washington Post noted, the engineer in charge of the program believed the underlying program was fine, but the data used during training of the algorithm was "faulty" – suggesting Google, perhaps, neglected to run a sufficient number of minority portraits through the AI.

Apparent algorithmic bias can be based on political views or gender as well. Facebook was accused of burying conservative news in the trending topics section on users' homepages, though many of the allegations focused on human curators; Facebook later met with conservative leaders and ended up relying on an algorithm that monitors media websites. And a blurb for a job with a "$US200K+" salary, which appeared in Google's ad program, was almost six times more likely to be shown to a man than a woman, a Carnegie Mellon University study found; it was unclear if advertisers wanted to target men, the scientists concluded, or if Google's software indicated men were more likely to click on the ad.

That is not to say that all algorithmic bias must be bad. Roboticists are creating artificially intelligent robots that stereotype to make quick, crucial decisions. Last year, Georgia Tech researcher Alan Wagner built an experimental machine to observe how a robot might distinguish between civilians and emergency personnel in a disaster.

Based on features like uniforms, for instance, Wagner's program was able to determine if someone was a police officer. But it also concluded that all firefighters must have beards – simply because the simulated firemen all happened to have facial hair during the experiment. Wagner now advocates that young artificial intelligences need "perceptual diversity", as he described to the website Inverse – for instance, showing a broad swath of humanity to a program being trained to recognise faces.

The Google Image results for three white or black teenagers now reflect that the algorithm has learned the debate over bias is popular, showing photo montages linked to news stories like the one you are currently reading.

But racial disparities among Google's treatment of teens are still easy to find: a Google video result for "three white teenagers" brings up YouTube news clips about the image result snafu. The same search, with "Asian" substituted for "white", yields dozens of links to pornography.

Source:  http://www.smh.com.au/technology/web-culture/google-under-fire-again-for-racist-search-results-20160613-gphpba.html

This week saw the publication of Mary Meeker’s annual Internet Trends report, packed full of data and insights into the development of the internet and digital technology across the globe.

Particularly of interest to us here at Search Engine Watch is a 21-page section on the evolution of voice and natural language as a computing interface, titled ‘Re-Imagining Voice = A New Paradigm in Human-Computer Interaction’.

It looks at trends in recognition accuracy, voice assistants, voice search and sales of devices like the Amazon Echo to build up an accurate picture of how voice interfaces have progressed over the past few years, and how they are likely to progress in the future.

So what do we learn from the report and Meeker’s data about the role of voice in internet trends for 2016?

Voice search is growing exponentially

We know that voice is a fast-rising trend in search, as the proliferation of digital assistants and the advances in interpreting natural language queries make voice searching easier and more accurate.

But the figures from Meeker’s report show exactly to what extent voice search has grown over the past eight years, since the launch of the iPhone and Google Voice Search in 2008. Google voice queries have risen more than 35-fold from 2008 to today, according to Google Trends, with “call mom” and “navigate home” being two of the most commonly-used voice commands.

A slide from Meeker's trends report showing the rise in Google Voice Search queries since 2008. The heading reads, "Google Voice Search Queries = Up >35x since 2008 and >7x since 2010, per Google Trends". The graph below it tracks the rise of three terms: "Navigate Home", "Call Mom" and "Call Dad", represented by a red line, a blue line and an aqua line respectively. All three terms have fairly low growth from 2008 to 2013, followed by a rapid rise with several sharp peaks upwards from 2013 to 2016. The 'Call Dad' trend grows the least, with the 'Call Mom' trend rising the fastest, briefly overtaken by 'Navigate Home' in 2015.

Tracking the rise of voice-specific queries such as “call mom”, “call dad” and “navigate home” is an unexpected but surprisingly accurate way to map the growth of voice search and voice commands. As an aside, anyone can track this data for themselves by entering the same terms into Google Trends. It’s interesting to think what the signature voice commands might be for tracking the use of smart home hubs like Amazon Echo in a few years’ time.
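
If you’d rather pull that Google Trends data programmatically, the unofficial pytrends library can do it. A minimal sketch (pytrends is a community project, not a Google API, so its behaviour may change):

```python
from pytrends.request import TrendReq  # pip install pytrends

pytrends = TrendReq(hl="en-US", tz=360)
pytrends.build_payload(
    kw_list=["call mom", "call dad", "navigate home"],
    timeframe="2008-01-01 2016-06-01",
)

interest = pytrends.interest_over_time()  # pandas DataFrame, one column per term
print(interest.tail())
```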

Google is, of course, by no means the only search engine experiencing this trend, and the report goes on to illustrate the rise in speech recognition and text to speech usage for the Chinese search engine Baidu. Meeker notes that “typing Chinese on a small cellphone keyboard [is] even more difficult than typing English”, leading to “rapidly growing” usage of voice input across Baidu’s products.

A slide from Meeker's report showing growth in Baidu voice input. The header reads, "Baidu Voice = input growth >4x... Output >26x, since Q2: 14". Below it are two graphs showing upward trends in usage between Q2 of 2014 and Q1 of 2016. The left-hand graph, Baidu Speech Recognition daily usage, has a steady upward climb with a slight plateau between Q2 and Q3 of 2015, followed by a much sharper increase. The right-hand graph, Baidu Text to Speech daily usage, shows a very gradual rise from Q2 in 2014 to Q2 in 2015, followed by a steep rise up to the present day.

Meeker also plots a timeline of key milestones in the growth of voice search since 2014, noting that 10% of Baidu search queries were made by voice in September 2014, that Amazon Echo was the fastest-selling speaker in 2015, and that Andrew Ng, Chief Scientist at Baidu, has predicted that by 2020 50% of all searches will be made with either images or speech.

While developments in image search haven’t been making as much of a splash as developments with voice, they shouldn’t be overlooked, as the technology that will let us ‘search’ objects in the physical world is coming on in leaps and bounds. In April, Bing implemented an update to its iOS app allowing users to search the web with photos from their phone camera, although the feature is limited to users in the United States, as they’re the only ones who can download the app.

The visual search app CamFind, which has been around since 2013, also has an uncanny ability to identify objects in the physical world and call up product listings, which has a huge amount of potential for both search and marketing.

Why do people use voice?

The increase in voice search and voice commands is not only due to improved technology; the most advanced technology in the world still wouldn’t see widespread adoption if it wasn’t useful. So what are voice input adopters (at least in the United States) using it to do?

The most common setting for using voice input is the home, which explains the popularity of voice-controlled smart home hubs like Amazon Echo. In second place is the car, which tallies up with the most popular motivation for using voice input: “Useful when hands/vision occupied”.

30% of respondents found voice input faster than using text, which also makes sense – Meeker observes elsewhere in the report that humans can speak almost 4 times as quickly as they can type, at an average of 150 words per minute (spoken) versus 40 words per minute (typed). While this has always been the case, the ability of technology to accurately parse those words and quickly deliver a response is what is really beginning to make voice input faster and more convenient than text.

As Andrew Ng said, in a quote that is reproduced on page 117 of the report, “No one wants to wait 10 seconds for a response. Accuracy, followed by latency, are the two key metrics for a production speech system…”

The third-most popular reason for using voice input, “Difficulty typing on certain devices”, is a reminder of the important role that voice has always played, and continues to play, in making technology more accessible. The least popular setting for using voice input is at work, which could be due to the difficulty in picking out an individual user’s voice in a work environment, or due to a social reluctance to talk to a device in front of colleagues.

Meeker’s report also looks into the usage of one digital assistant in particular: Hound, an assistant app developed by the audio recognition company SoundHound, which was also recently used to add voice search capabilities to the company’s music search engine of the same name.

What’s interesting about the usage breakdown for Hound, at least among the four fairly broad categories that the report divides it into, is that no one use type dominates overwhelmingly. The most popular use for Hound is ‘general information’, at 30%, above even ‘personal assistant’ (which is what Hound was designed to do) at 27%.

Put together with the percentage of queries for ‘local information’, more than half of voice queries to Hound are information queries, suggesting that many users still see voice primarily as a gateway into search. It would be interesting to see similar graphs for usage of Siri, Cortana and Google’s assistants to determine whether this trend is borne out across the board.

A tipping point for voice?

Towards the end of the section, Meeker looks at the evolution and ownership of the Amazon Echo, which as a device which was specifically designed to be used with voice (as opposed to smartphones which had voice capabilities integrated into them) is perhaps the most useful product case study for the adoption of voice commands.

Meeker notes on one slide that computing industry inflection points are “typically only obvious with hindsight”. On the next, she juxtaposes the peak of iPhone sales in 2015 and the beginning of their estimated decline in 2016 with the take-off of Amazon Echo sales in the same period, seeming to suggest that one caused the other, or that one device is giving way to the other for dominance of the smart device market.

I’m not sure if I would agree that the Amazon Echo is taking over from the iPhone (or from smartphones), since they’re fundamentally different devices: one is designed to be home-bound, the other portable; one is visual and the other is not; and as I pointed out above, the Amazon Echo is designed to work exclusively with voice, while the iPhone simply has voice capabilities.

But it is interesting to view the trend as part of a shift in the computing market towards a different type of technology: an ‘always-on’, Internet of Things-connected device specifically designed to work with voice, and perhaps that’s the point that Meeker is making here.

Meeker points to the fast movement of third-party developers to build platforms which integrate the Alexa voice assistant into different devices as evidence of the expansion of “voice as computing interface”. While I think we will always depend on a visual interface for many things, this could be the beginning of a tipping point where voice commands take over from buttons and text as the primary input method for most devices and machines.

Hopefully Meeker will revisit this topic in subsequent trends reports so that we can see how things play out over the next few years.

Source:  https://searchenginewatch.com/2016/06/03/what-does-meekers-internet-trends-report-tell-us-about-voice-search/

Saturday, 11 June 2016 12:14

An SEO Game Plan for Impatient Marketers

SEO is a long-term strategy. You have to work hard to develop a body of highly optimized content for readers, and then you have to make efforts off-site to prove the relevance and authority of your website to search engines.

 

For the impatient marketer just starting out with their SEO strategy, the task can seem less than worthwhile. If this sounds familiar – and you need some quick SEO wins – this blog post is for you.

 

Here’s an eight-step SEO game plan for impatient marketers.

 

Step 1: Make Sure You’re Optimized for Mobile
First things first:

 

Is your site mobile-compatible? If not, this is one of your most important SEO tasks.

If you’re not sure, you can check right now using Google’s Mobile-Friendly Test:

 

Mobile Friendly Test Website Screenshot

Just enter your URL and Google will tell you in seconds.

 

Optimizing for mobile is a really important step to get quick gains in SEO. For one, people in the US are now spending 61% of their time online using a mobile device. If someone accesses your site through mobile, and it’s not optimized, they’re much more likely to bounce, which is going to affect your results in SERPs.

 

And have you heard of Mobilegeddon? This is what marketers are calling Google’s April 2015 ranking algorithm update – one that was designed to boost the rank of mobile-friendly pages in search.

 

Step 2: Create a Site Map


If you’re already mobile-compatible, then the next step is making sure you have a sitemap.

A sitemap gives search engines detailed information about what pages are on your website, which will make it easier for them to crawl it. You can create one using XML Sitemaps for free (up to 500 pages).

 

XML Website Screenshot

If you use WordPress, there are simple plugins you can use to build a sitemap, such as Yoast. Otherwise, take your XML sitemap and upload it to the root directory of your site so it lives at /sitemap.xml.
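
For reference, a sitemap is just an XML file that lists your URLs. A minimal sketch with placeholder URLs and dates:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2016-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/blog/sample-post/</loc>
    <lastmod>2016-05-20</lastmod>
  </url>
</urlset>
```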

 

Step 3: Get Set up With Search Console


If you haven’t already, you should sign up for Google Search Console and Bing Webmaster Tools, and verify your website.

 

Then you can submit your sitemap to each search engine. Here are the instructions for Google, and here are the ones for Bing.

 

On Search Console, just go to your dashboard, click ‘Crawl’ on the side navigation, then ‘Sitemaps.’ You’ll then see the option to ‘Add/Test Sitemap.’

 

Google Search Console Screenshot

Then just follow the prompts to get it set up.

 

Step 4: Check Your Site Speed


Checking your site speed is easy, too. In Google Analytics, click the “Reporting” tab in the top navigation.

 

Then, in the side toolbar, click Behavior > Site Speed > Overview to see your data.

 

Site Speed Menu Screenshot

Site speed is an important rank factor for SERPs, and it affects your bounce rate. People are impatient – the slower your website loads, the more likely they are to leave. In fact, 47% of consumers expect pages to load in 2 seconds or less.

 

Site speed also has an impact on your conversions—a 1-second loading delay can cause up to a 7% loss in conversions. This means you should do everything you can to make sure your website runs as fast as possible.

 

Google offers its own speed suggestions in the Site Speed drop-down menu. You can also look into different site speed fixes based on your platform (WordPress, Weebly, etc.). Here are some other common ways your content or setup can slow down your site speed.

 

 

Step 5: Make Sure Google is Crawling Your Website Properly


It’s important to make sure that Google is crawling your site properly – if it’s not, your content might not appear in search results at all.

 

If you’ve already submitted your sitemap to Search Console, check the Crawl Errors report to see if Google had problems crawling any of your pages. You should check back on this often as your website grows.

 

If you don’t already have a robots.txt file, create one and upload it to the root of your site. If you already have one, use the robots.txt Tester in Search Console to check that it’s not blocking any important URLs.
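
A robots.txt file is just a plain-text file served from the root of your domain. A minimal sketch with hypothetical paths:

```
User-agent: *
Disallow: /admin/
Disallow: /duplicate-page/

Sitemap: https://www.example.com/sitemap.xml
```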

 

There are also many external tools to help you crawl your website, including:

Screaming Frog
SEO Chat
Webmaster World


Step 6: Check for Missing and Duplicate Data


Next, you should run your own crawler to identify any missing or duplicate data. Most of the crawling tools out there are paid, but Screaming Frog’s free version will crawl up to 500 URLs.

 

Here’s what you should be looking out for:

 

Missing ALT tags
Missing (or duplicate) meta descriptions
Missing (or duplicate) H1 and H2 tags
Duplicate pages
404 errors


Use this information to go back through and fill in any gaps in your website’s data.

Google doesn’t like websites with duplicate content, so it’s important to either block duplicate pages with your robots.txt file, or make the content unique.
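
If you want a quick, scriptable spot-check of the items on the checklist above before running a full crawler, something like the following sketch will do. It assumes the requests and beautifulsoup4 packages are installed, and the URL list is a placeholder:

```python
import requests
from bs4 import BeautifulSoup

# Placeholder URLs: swap in pages from your own site.
urls = [
    "https://www.example.com/",
    "https://www.example.com/blog/sample-post/",
]

for url in urls:
    response = requests.get(url, timeout=10)
    if response.status_code == 404:
        print(f"{url}: 404 error")
        continue
    soup = BeautifulSoup(response.text, "html.parser")

    if not soup.title or not (soup.title.string or "").strip():
        print(f"{url}: missing <title>")
    if not soup.find("meta", attrs={"name": "description"}):
        print(f"{url}: missing meta description")
    if not soup.find("h1"):
        print(f"{url}: missing H1 tag")
    for img in soup.find_all("img"):
        if not img.get("alt"):
            print(f"{url}: image missing ALT text ({img.get('src')})")
```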

 

Step 7: Start Optimizing for Local Search


Google is increasingly favoring locally optimized results for SERPs.

For example, if I search for “phone accessories,” Google doesn’t just show me online retailers:

 

Google Search Screenshot for Cell Phone Accessories

Google used my IP address to highlight phone accessory options near me, before the first organic result.

So if you want some quick SEO wins, focus on local.

 

First, add your business to Google My Business. Add your business name, address, and phone number (NAP) and make sure the information is accurate.

 

Google compares your My Business NAP to your other NAP listings around the web. If there are discrepancies, it will affect your rank.

 

Next, you should go to the other popular search data providers and make sure your NAP is available for Google to find. Check Yelp, Bing, Yellow Pages, and other listings relevant to your industry.

 

You need to make sure your NAP is correct everywhere it appears. Moz Local can help you find your other listings using just your business name and zip code:

 

Moz Local Screenshot

For incorrect NAPs, either update them yourself or contact the website to have the information corrected.

 

Step 8: Look for Long-Tail Keyword Opportunities


At this point, you’ve sorted out most of the technical issues that can really affect your SEO. Now, you have no choice but to move on to keyword optimization.

 

Still, there are some ways to get quick wins when trying to rank for keywords. The best one is looking for long-tail keywords.

 

Ranking for the most popular keywords in your industry can take years – or it could never happen at all – but if you find the right long-tail keywords to optimize for, you can shoot to the top of rankings.

 

According to Moz, a huge portion (70%) of keywords have relatively low demand.

Long-tail keywords are discovered by thinking about user intent. When your audience sits down at the search engine, what are they typing in?

 

For example, say I’m an online retailer of phone accessories. Ranking for keywords like “iphone case” or “screen protectors” will be difficult — you’d be going up against the biggest brands:

 

 

Google Search Screenshot for phone cases

But a lot of the time, your audience wants something specific. “Fast charge wireless charging pad” or “Galaxy Note 5 Flip Cover Case” are long-tail keywords that would have much less rank competition.

 

Finding good long-tail keywords is mostly about imagining user intent. Once you come up with some options, you can determine how relevant and competitive they are using Google AdWords.
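
If you like to script this step, Google’s public autocomplete endpoint can also surface long-tail ideas. A minimal sketch; this endpoint is unofficial and unsupported, so it may change or stop working at any time, and the seed keyword is just an example:

```python
import requests

def autocomplete_suggestions(seed):
    """Return Google's autocomplete suggestions for a seed keyword."""
    response = requests.get(
        "https://suggestqueries.google.com/complete/search",
        params={"client": "firefox", "q": seed},
        timeout=10,
    )
    # The response is JSON shaped like [seed, [suggestion, suggestion, ...]]
    return response.json()[1]

for suggestion in autocomplete_suggestions("wireless charging pad"):
    print(suggestion)
```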

 

If you’re searching for long-tail keywords in particular, I recommend using Keyword Tool Dominator. It uses Google autocomplete to help find relevant long-tail keywords right from search:

 

 

Keyword Tool Dominator Website Screenshot

Conclusion


There’s no getting around the long-haul optimization strategies if you want to get and maintain the highest rank possible. But if you’re just starting out with SEO, there are a lot of quick tricks you can use to jumpstart your efforts.

Follow the eight steps above to start your SEO game plan for impatient marketers.

 

Source:  https://www.searchenginejournal.com/serps-success-seo-game-plan-impatient-marketers/158977/

It’s Friday, so welcome to our weekly round up of search marketing and related news.

This week we have the 16 companies dominating Google, stats on retailers’ search budgets, and a look at accusations around Google and searches for Hillary Clinton.

Is Google manipulating searches for Hillary Clinton? Er, no…

There’s been talk of Google manipulating autocomplete suggestions for searches on Hillary Clinton. A video from SourceFed claims that searches around Clinton are being manipulated as they don’t return the suggestions they would expect to find.

Specifically, searches such as “Hillary Clinton cri-” did not suggest “Hillary Clinton criminal charges” and “Hillary Clinton in-” did not return “Hillary Clinton indictment.”

SEO and reputation management expert Rhea Drysdale does an excellent job of debunking the theory in a post on Medium.

Essentially, SourceFed failed to compare similar searches for Donald Trump, which fail to suggest phrases like “Donald Trump lawsuit”.

trump la

In a nutshell, if Google is manipulating searches for Clinton, it’s doing the same for Trump. There’s another theory too – the popularity of the SourceFed video has led to thousands trying out these searches for themselves, thus potentially manipulating these results.

Google becomes the world’s most powerful brand

Apple’s value has dropped 8% to $228 billion in the past year, while Google’s has risen 32% to reach $229 billion. So Google takes top spot in Millward Brown Digital’s annual report.

mill brown

Amazon’s search spending

Fractl has analyzed the marketing spend of some of the biggest retailers, and search gets the lion’s share of Amazon’s budget.

During the period studied, the ecommerce giant spent $8 million on TV and radio, $54 million on print and $1.35 billion on search.

amazon-budget

For more stats, see Mike O’Brien’s piece on the research.

In search, do the rich just get richer?

Earlier this week, Chris Lake covered an excellent Glen Allsop study into how 16 companies are dominating Google’s results.

As Chris says in his post:

In this case, the rich are major publishing groups. The way they are getting richer is by cross-linking to existing and new websites, from footers and body copy, which are “constantly changing”.

And these are the big 16:

top16

Source:  https://searchenginewatch.com/2016/06/10/four-of-the-most-interesting-search-marketing-news-stories-of-the-week/

Google announced a slate of major updates and new products for AdWords advertisers in its Google Performance Summit today. Their largest advertisers and partners had a sneak peek at the announcement yesterday, and I’m excited to bring you the details on what’s new!

The updates were revealed by Sridhar Ramaswamy, Senior Vice President of Ads & Commerce and Jerry Dischler, Vice President of Product Management, AdWords. Here’s what they had to say about exciting changes coming to AdWords:

Crazy New AdWords Stats

Google AdWords is more effective than ever and driving massive economic activity, according to Google. Among the statistics they shared in their presentation:

Google’s search and advertising tools drove $165 billion of economic activity for over 1 million businesses, website publishers, and nonprofits across the United States in 2015.
There are now trillions of searches on Google.com and over half of those searches happen on smartphones.

Nearly one third of all mobile searches on Google are related to location. In fact, location-related searches are growing 50% faster than other mobile queries.
Since AdWords store visits were introduced two years ago, advertisers have measured over 1 billion store visits worldwide.

AdWords Redesign & New Features

In the announcement, Ramaswamy writes: “To help marketers succeed in this mobile-first world, we have redesigned AdWords — from the ground-up — and re-thought everything from creatives and bidding, to workflow and measurement.”

He added that Google has realized that accounting for mobile and actually designing for mobile-first are two very different things, which had resulted in their completely changing how they think about and build AdWords.

First, they created Universal App Campaigns, which have driven more than 2 billion app downloads since they came out. Now, Google is launching some amazing new products:

Expanded Text Ads

Responsive Ads for Display

Individual Bid Adjustments for Device Types

Local Search Ads for Google.com & Google Maps

NEW! Expanded Text Ads in AdWords

Did you think Google would EVER change its ad text limits? The announcement revealed that taking away right sidebar ads was part of the preparation for this new product, Expanded Text Ads.

Now, you’ll have more room to sell your wares on the SERPs, with two 30-character headlines, 80 characters for description, and an auto-extracted URL with customizable domain path.

upgraded ad components

This is AMAZING and is going to make Quality Score even more important, as those top spots are going to take up more prime real estate. This is going to make anything below #2 or #3 even more useless.

Expanded text ads are optimized for the screen sizes of the most popular smartphones. Google reports that early advertising tests show up to 20% increases in CTR. Expanded Text Ads will roll out later this year.
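
If you want to sanity-check draft copy against those new limits before the rollout, a few lines of script will do it. A minimal sketch using the character counts quoted above; the sample copy is made up:

```python
HEADLINE_LIMIT = 30     # characters per headline
DESCRIPTION_LIMIT = 80  # characters for the description line

def check_expanded_text_ad(headline1, headline2, description):
    problems = []
    for label, text, limit in [
        ("Headline 1", headline1, HEADLINE_LIMIT),
        ("Headline 2", headline2, HEADLINE_LIMIT),
        ("Description", description, DESCRIPTION_LIMIT),
    ]:
        if len(text) > limit:
            problems.append(f"{label} is {len(text) - limit} character(s) over the {limit}-character limit")
    return problems or ["Ad copy fits the new limits"]

print(check_expanded_text_ad(
    "Hand-Made Phone Cases",
    "Free Shipping on Orders $25+",
    "Browse hundreds of designs for iPhone and Galaxy. 30-day returns.",
))
```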

Responsive Ads for Display

New responsive ads for display adapt to the content on the site on which they appear. Google promises, “They also unlock new native inventory so you can engage consumers with ads that match the look and feel of the content they’re browsing.”

homespiration ad

Advertisers just need to provide a headline, description, image and URL. Google will do the rest.

Individual Bid Adjustments for Device Types

Bid adjustments are a super important tool for controlling how much you pay and where you appear according to different parameters. You can adjust your mobile bids, for example, by setting a percentage you’re willing to pay (more or less) against desktop.

Now, Google is also letting you set individual bid adjustments by device type, so you can choose to bid more or less for mobile, desktop or tablets. They widened the adjustment range, too, allowing up to 900% variation.
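
As a quick worked example of how those adjustments play out (the numbers are hypothetical):

```python
# A $1.00 base max CPC with hypothetical device bid adjustments.
base_cpc = 1.00
adjustments = {"desktop": 0.00, "mobile": +0.50, "tablet": -0.20}

for device, adjustment in adjustments.items():
    effective_bid = base_cpc * (1 + adjustment)
    print(f"{device}: ${effective_bid:.2f}")
# desktop: $1.00, mobile: $1.50, tablet: $0.80
```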

Local Search Ads for Google.com & Google Maps

Almost a third of mobile searches have local intent, Google said. Their new local search ads for Google.com and Google Maps give advertisers using location extensions more real estate on searches for specific products or services in that geographic area.

walgreens1

Searchers will be able to see special offers or browse available inventory right from the Google Maps ad.

Source:  https://www.searchenginejournal.com/major-google-adwords-changes-announced-expanded-text-ads-new-local-search-ads/164622/

 

Stan Sthanunathan, senior vice president, consumer and market insights at Unilever, recently spoke during the opening keynote of the MRS’s annual conference, Impact 2016, which took place in London on March 15 and 16. Taking a look at the “seismic changes” within the technology and digital world, Sthanunathan’s presentation examined how traditional agencies can adhere to new rules to keep up with the pace of change and survive in a marketplace “in which data is becoming increasingly democratized,” according to an article by Research Live.

“What we have is incredible access to information,” Sthanunathan said. “If we all think that information is going give us a competitive edge and we’re going to use it to become great, don’t think that’s the case. You can get answers to a lot of questions by searching on Google.”

the ten Commandments

The article provided the 10 commandments Sthanunathan suggested agencies and clients should follow:

1. Get social or get ready to be branded anti-social. Mine the information gleaned from social media.
2. Data is commoditized but insights are getting democratized – a Google consumer survey costs as little as £500.
3. Get visual or get impaired. Think how to bring insights to life using a fact-based, rather than fact-filled, presentation.
4. Innovate, don’t renovate. Renovating comes naturally because it’s easy. “But renovation in my way of thinking is more like polishing a turd,” Sthanunathan said.
5. Become the master of metamorphosis – change on an ongoing basis, change every day.
6. Digitize and humanize. Tame data.
7. Think bi-polar.
8. It’s too risky not to take risks. Be bolder than you have been traditionally.
9. Never underestimate the power of N=1 – brands are increasingly being influenced by people.
10. Real-time is the new currency, cutting the time lapse between asking the question and getting the answer.

Source:  http://quirksblog.com/blog/2016/03/21/10-commandments-for-keeping-up-with-the-pace-of-change/

 

This week Bing released a new tool for content publishers to get their work discovered by more readers. Bing News PubHub allows publishers to submit their news sites for distribution to Bing users. The company claims that publishers of all sizes will be able to use this tool to reach more of the Bing audience with new and interesting content.

As an argument for considering Bing News PubHub as a news portal to share your content on, the company makes the following claims:

More than 20% of the US desktop search market uses Bing (a figure backed by the latest comScore numbers), which the company says helps it deliver the “most comprehensive and relevant news” (a claim that is harder to back up with real figures).

Millions of Windows 10 users search on Bing through Cortana, and discover content through the Outlook News Connector.

News is available in the Bing Search app, which is available on iOS and Android.

“When publishers submit their content through the Bing Publisher Network, they’ve just expanded their reach significantly, giving their stories and outlets even greater exposure.”

To get started using Bing PubHub you first need to become a verified publisher, which can be done in three steps:

Follow the Bing Webmaster Guidelines, a set of rules similar to Google’s Webmaster Guidelines.
Verify that you own the site using Bing Webmaster Tools, a dashboard similar to Google’s Search Console.
Finally, fill out Bing’s submission form to submit your site for consideration.

However, even after going through that entire process, your content might still be rejected if it does not meet the following criteria:

Newsworthiness: Content that reports on timely events and topics that are interesting to users.
Originality: Content that provides unique facts or points of view.

Authority: Identify sources, authors, and attribution of all content.
Readability: This includes creating content with correct grammar and spelling, as well as a site design that’s easy for users to navigate.

According to the Bing News Team, more updates and features are said to be on the way.

Source:  https://www.searchenginejournal.com/expand-reach-content-bing-news-pubhub/165555/

This story has been going on for years, but it looks like France’s equivalent of the IRS (Direction générale des finances) wants more proof. According to Le Monde and Le Parisien, Google’s office in Paris is being raided right now. According to the DGF, Google should be paying more taxes in France as the company has been doing more than just tax-optimization strategies. Google may be facing a $1.8 billion fine (€1.6 billion).

According to Le Parisien, around a hundred DGF employees have been collecting documents since 5 AM this morning. It’s unclear whether this raid is related to the ongoing investigation, but it seems very likely.

Back in 2012, Le Canard Enchaîné revealed that Google was facing a $1.3 billion fine (€1 billion) for tax penalties in a tax noncompliance case. In 2011, Google France reported €138 million and paid €5.5 million in taxes. Comparatively, the Irish subsidiary had been doing amazingly well, reporting €12.4 billion in revenue the same year.

Google doesn’t hide that it’s been doing tax optimization like countless other tech giants, such as Apple, Amazon and Facebook. Tax optimization isn’t illegal. According to Google, it only handles support and doesn’t sign contracts in France. Yet salespeople were based in France in 2011 and signed French contracts with French clients. This is key to the investigation.

Advertising contracts with French advertisers were taxed in Ireland in 2011. But the DGF considers them as French contracts and expects Google to pay taxes for them. Google contested the accusation.

In 2014, Le Point confirmed that there was an ongoing investigation against Google in France. At the time, Google even admitted that the company expected to pay a huge fine and made a provision.

In February 2016, Reuters and the AFP reported that the DGF asked Google to pay $1.8 billion (€1.6 billion) for the same case. The DGF and Google declined to comment.

Here’s a glimpse of the DGF investigation. Google’s European HQ is called Google Ireland Holdings. It is the owner of another company, Google Ireland Limited. Google Ireland Limited cashes in all the revenue from all European subsidiaries. But, in order to lower the tax rate, Google Ireland Limited pays billions of royalties to Google Ireland Holdings. In 2011, it was $4.6 billion. It drastically lowers the profit of Google Ireland Limited.

Despite the name, Google Ireland Holdings’ cost center is in Bermuda and is called Google Bermuda Unlimited. In Bermuda, corporate tax doesn’t even exist. But there is a tax if you want to transfer big sums from Ireland to Bermuda. That’s where the Netherlands comes in.

If you transfer money from Ireland to the Netherlands, then to Bermuda, there is no tax. Google Netherlands Holdings BV, you guessed it, is a subsidiary that only transfers money from Ireland to Bermuda.

Again, none of this is illegal. The main issue in France is that some Irish contracts could be French and could be subject to French taxes. That’s why Google is being raided right now.

Source:  http://techcrunch.com/2016/05/24/google-office-in-paris-is-being-raided-for-tax-noncompliance-reports-say/
