
Search and big data analytics have evolved significantly over the last few years, and organizations are increasingly using these technologies to meet their mission-critical needs.

At the beginning of 2016, we were talking a lot about machine learning and semantic search and how they would be key developments in this space. 

Those have certainly been hot topics and continue to be areas that companies are seeking to exploit for their data-driven applications. 

But what will we be talking about in 2017 as it pertains to this space?

Here’s a look at five areas you can expect to hear more of this year and beyond.

Open Source Rises to the Top

Open source technologies are becoming more prominent in a wide range of use cases, from traditional enterprise search to log analytics, e-commerce search, and even government document search.

In fact, current data shows open source search engines have gained significant popularity because of flexibility, cost and features. According to DB-Engines, Elasticsearch and Solr — two open source search engines based on Lucene — top the list of leading commercial and open source search engines.

Just last month I discussed the features and limitations of Elasticsearch and Solr in this article.

With its growing use in the commercial and government space, the move to open source will continue to be a hot topic as organizations seek greater features, cost savings, and more flexibility with their search and big data analytics solutions.

Life Without Google Search Appliance

In early 2016, Google announced the end of support for the Google Search Appliance (by March 2019) as part of its strategic move to a cloud-based platform. Since then, many have asked the question: What now?

In my article last summer, I provided some tips on moving from the GSA and looked at some of the replacement options available at the time.

As we enter 2017, there are still no concrete details from Google about a new cloud-based search solution, so users are forging ahead, seeking out and comparing the existing alternatives. With the March 2019 deadline getting closer, we’ll be hearing a lot more about those alternatives, and we may even get word on Google’s cloud-based plans.

Analytics Powered by Enterprise Data Lakes

Enterprises have a lot of data, but how well they use it to derive insights is key to success. Over the last year, we’ve been hearing a lot of hype around enterprise data lakes (or enterprise data hubs) as a way to bring together data silos and make the right data available to the right users at the right time.

There is a wide variety of structured and unstructured data in enterprise data lakes. Because they are schema-free and can scale to billions of records, search engines are an ideal tool for storing, processing, accessing, and presenting this data.

Data lakes’ search and analytics capabilities are nearly endless when we combine search engines, big data techniques, and visualization dashboards in groundbreaking use cases such as bioinformatics, precision agriculture, and precision medicine.

As data lakes continue to gain popularity as a way to store and analyze massive amounts of data, we’ll see organizations continuing the conversation in 2017 about how best to exploit them.

Search Engines Become 'Insight Engines'

As Google, Cortana, and Siri have shown, search is becoming much more than just keyword matching.

We’re now heading into the age of search results personalization. Search engines are becoming personal digital assistants, or what Gartner calls insight engines. This has been made possible by big data analytics techniques like machine learning and predictive analytics.

This personalization is pervading the modern business world in a multitude of use cases, from intranet search to e-commerce search, recruiting, medical research, media and publishing, and many others. Organizations are just beginning to scratch the surface of how real-time personalization can enhance their operations, so expect a lot of talk about how to implement it in the coming year.

Search Engine Scoring

We know that analyzing statistically valid scores helps increase search engine relevancy over time. But while this method can significantly increase business value and the bottom line, few organizations have started search engine scoring, and fewer still are implementing it effectively.

With that said, we are observing proven success with newer, better algorithms used in the scoring process, some of which are discussed in this article. This is, without a doubt, a solid technique that will continue to grow and be discussed in the coming years as organizations seek to improve their users' search experience.

In conclusion, with the rise of open source, massive volumes of structured and unstructured data, and the need to do complex analytics, we will continue to hear a lot on these topics throughout 2017 as search and big data continue to converge. 

Source: cmswire.com


Now every company can easily monetize their data by letting everyone both inside and outside an organization use search to analyze their enterprise data in seconds.

New Extended Enterprise Edition lets customers and OEM partners easily embed ThoughtSpot's Relational Search into their apps, extranets, and B2B portals designed for employees, partners, or customers.

Next generation scale-out architecture combined with unlimited user licensing designed to support millions of users.

PALO ALTO, Calif. and GRAPEVINE, Texas, March 7, 2017 /PRNewswire/ -- ThoughtSpot, the leader in search-driven analytics for the enterprise, today announced the general availability of a new Embedded Analytics solution available with ThoughtSpot's new Extended Enterprise Edition. Recently recognized as a new entrant in the Gartner 2017 Magic Quadrant for Business Intelligence and Analytics Platforms report, ThoughtSpot is unveiling this new offering at the Gartner Data and Analytics Summit in Grapevine, TX this week.

ThoughtSpot's Embedded Analytics solution enables organizations to deliver insights to employees, partners, and customers across their extended ecosystem, by infusing search-driven analytics into their custom business applications, B2B portals, and workflows. Through the power of Relational Search, ThoughtSpot vastly expands the reach of business intelligence so everyone both inside and outside the enterprise can now benefit from a search-driven experience to analyze data in seconds.

Easy-to-use analytics capabilities are no longer a nice-to-have for modern software solution providers and enterprises alike. As massive amounts of data are being generated, it is vital for companies to provide data access to users beyond the enterprise and find new, innovative ways to monetize their data. Employees, customers, and partners have long relied on centralized BI or development teams to build reports for data insights. Unfortunately, the further away users are from these constrained resources, the longer it takes to get answers to data-driven questions. Similarly, solution providers attempting a "build-first" approach to their analytics capabilities have incurred increased costs and longer time-to-market in launching their solutions.

ThoughtSpot Extended Enterprise employs a next-generation analytics architecture powered by Relational Search. This new breed of BI architecture and in-memory calculation engine makes it easy to analyze billions of rows of data across multiple data sources, while delivering sub-second performance and governance at scale. As a result, non-technical business people beyond the boundaries of the enterprise can search to find fast answers to their data questions without relying on data experts. Simple to build, with flexible deployment options and an architecture that scales out with ease, ThoughtSpot Extended Enterprise provides the agility and control so solution providers can get to market fast.

"ThoughtSpot's Embedded Search-Driven Analytics solution lets us deliver data insights directly to our customers in a way that's intuitive and incredibly fast," said Zahra Safavian, VP of Product at SSB Bart. "At SSB Bart, we are creating a world where digital technology can become a profound and empowering force in the lives of users with disabilities. With ThoughtSpot, we will make an impact even faster."

"Search-driven analytics is revolutionizing how employees quickly and easily analyze their company data. We're now thrilled to bring the same ease-of-use and performance-at-scale to our customers' customers and partners," said Ajeet Singh, co-founder and CEO of ThoughtSpot. "Now customers and partners can embed ThoughtSpot's Relational Search into any business application, extranet, or portal. It's now easier than ever for companies to monetize their data with human-scale analytics."

ThoughtSpot's Extended Enterprise capabilities include:

  • Seamless search-driven experience, infused into any application: Rather than waiting weeks or months for a backlogged BI or development team to create rigid charts and dashboards, ThoughtSpot's embedded search-driven approach enables every business person inside or outside the enterprise to use the power of search to get instant answers to their data questions, just as easily as they use search in their personal lives. They can ask as many questions as they want, and be able to answer that next question instantly, without incurring the costs of tapping into constrained data experts to update reports.
  • Fast query performance at scale: ThoughtSpot's next-generation analytics architecture is built from the ground up to deliver access to data at 'human scale' and enable anyone to analyze their data in seconds. Its Relational Search Engine and built-in in-memory calculation engine make it easy for millions of users to analyze billions of rows of data across multiple data sources, and deliver sub-second performance. No cubes, aggregations, summary tables or data marts are required to optimize for performance. As a result, source data fidelity is maintained, enabling end users to flexibly ask questions on data at any level.
  • Centralized governance and security at scale: Expanding the reach of business intelligence well beyond enterprise boundaries requires a flexible and scalable governance framework that can scale to large numbers of users and security groups. ThoughtSpot provides granular column, object and row-level access control to a single shared data model across billions of records and scales to millions of users and hundreds of thousands of security groups. Fine-grained permissions ensure that only authorized users have access to the right set of data.
  • Flexible deployment options: ThoughtSpot offers many options to rapidly inject analytics in custom applications, portals or workflows. Embedding Relational Search delivers the full power of ThoughtSpot's search-driven experience into any application. Combine that with pluggable charts and dashboards to embed any visualization into an application's workflow. An extensive REST-based Data and Metadata API gives customers complete control to flexibly implement a custom look-and-feel on query result sets from ThoughtSpot. In addition, ThoughtSpot provides custom branding, white-labeling and OEM options to maintain a brand-consistent experience.

ThoughtSpot is a Premier Plus exhibitor at the Gartner Data and Analytics Summit in Grapevine, TX from March 6 - March 9, 2017 and is featuring its Extended Enterprise solution in an informative session on Tuesday, March 7, 2017 at 3:45pm CT as well as in Booth #413 in the Solution Showcase.

Pricing and Availability

ThoughtSpot Extended Enterprise is now generally available. ThoughtSpot software can be deployed on-premise or hosted in a cloud-based infrastructure such as Amazon Web Services. Price is based on data volume and includes unlimited end user licenses. For more information, or to request a demo, please visit http://www.thoughtspot.com/demo.

About ThoughtSpot

ThoughtSpot is a next generation analytics platform powered by the World's first Relational Search engine. ThoughtSpot's search-driven analytics lets business people build reports and dashboards in seconds. ThoughtSpot's new breed of BI architecture and in-memory calculation engine was built from the ground up to make it easy to analyze billions of rows of data across multiple data sources, while delivering sub-second performance and enterprise-wide governance. ThoughtSpot connects with any on-premise, cloud, big data, or desktop data source and deploys 85% faster than legacy technologies. With ThoughtSpot, BI & Analytics teams have cut their reporting backlogs by over 90% and enabled thousands of daily decisions throughout their organizations. ThoughtSpot has built the world's most advanced, yet easy-to-use number-crunching machine with a singular mission - to deliver access to data and insights at "human scale." For more information please visit thoughtspot.com.

Source: http://www.prnewswire.com/news-releases/thoughtspot-launches-embedded-search-driven-analytics-platform-300419404.html


The holidays are all about tradition and bringing cheer to your family, friends, and clients. For me, that applies to the Google Analytics metrics I have come to know and love when sending my clients their monthly reports. For most of us, the holidays are the all-too-short-lived time where families gather ’round a tree, get and give gifts we may or may not use, and eat a harvest of homemade food. For others, the more data-driven geeks like myself, that slew of holiday treats takes second place to the festive findings of my Google Analytics dashboards.

Whatever you’re looking forward to this holiday season, I’ve got some easy Google Analytics reports for my SEO marketing friends that will make your extra two-and-a-half days off a little more ‘om.’ Ahead, find six Google Analytics reports that brought me lots of cheer during this cornucopia-and-gourd-filled fall season. Hint: You may want to add these to your wishlist!

Before I dive into the details of the Google Analytics reports, I wanted to share an overview of the factors and metrics I consider before building a report. These vary from person to person and brand to brand.

Before We Begin

First, what metrics align with your overall business goals? Google Analytics reports give you an overview of your website and business health. What metrics are important to your client? Here is a snapshot of what I take into consideration for the majority of my clients:

  • Organic Traffic tells you the number of users who have visited your site organically (unpaid), whether through Google, Bing, or another search engine.
  • Referral Traffic tells you the number of users who have visited your site from sources other than search engines. Referral traffic is also a good indicator of how your content and brand visibility are performing off-site.
  • Organic Conversions tell you how your content, landing pages, and user experience are performing.
  • Referral Conversions tell you which sites you are getting the most external traffic from. Yes, you can do this with UTM parameters, but referral conversions give you a wider-scope view.

Next, take a look at the timeframe you want to monitor. For example, I monitor reports on a 28-day period so each month is consistent. The same goes for quarterly: I use a 90-day period for my quarterly reports.

Lastly, how do you plan to track your SEO growth? I prefer to track my progress week over week and to compare a chosen period against the same period in the previous year. But others prefer a snapshot view with no previous history to compare against.

Now, let’s dig in a little deeper to discover what SEO reports I use regularly to track my performance and show proof of SEO value.

1. Year-Over-Year Organic Traffic

How to Build This Report:

1. Go to Acquisition > All Traffic > Organic Search

[Screenshot: year-over-year organic traffic in Google Analytics]

2. Click Customize in the top left corner of the screen.

3. Create a new custom report named “Organic by Month.” The Metric Groups should read “Sessions,” “Goal Conversion Rate,” and “Goal Completions”; “Month of Year” should be selected under Dimension Drilldowns; and the Filters should include “Medium” + “Exact” + “Organic.”

[Screenshot: custom report setup for year-over-year organic traffic in Google Analytics]

4. You should get a month-over-month (MoM) comparison.

[Screenshot: month-over-month comparison in Google Analytics]

5. Export each year to Google Sheets and combine the two sheets.

6. Then, you can create a Google chart to begin identifying trends and patterns year-over-year.

[Screenshot: year-over-year organic search data in Google Sheets]
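If you would rather script the year-over-year comparison than chart it by hand, here is a minimal sketch in plain JavaScript (runnable in a browser console or in Node). The monthly session counts below are placeholders; paste in the values from your exported report:

// Compare this year's organic sessions to last year's, month by month.
var lastYear = [1200, 1150, 1300, 1250, 1400, 1380, 1290, 1310, 1275, 1220, 1180, 1350];
var thisYear = [1350, 1280, 1420, 1390, 1210, 1500, 1460, 1480, 1440, 1150, 1330, 1510];

thisYear.forEach(function (sessions, month) {
  var pctOfLastYear = (sessions / lastYear[month]) * 100;
  console.log('Month ' + (month + 1) + ': ' + pctOfLastYear.toFixed(1) + '% of last year');
});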

What This Report Tells You:

The fear of losing organic traffic is real, especially when it comes to your personal website or a client’s site. Surely, the best practice is to monitor your data regularly, checking real-time data and analyzing new vs. returning visitors or referral traffic. But once you see your organic traffic slowly drip into a downward pattern, are you sure you were monitoring the right data?

Unfortunately, “out of sight, out of mind” does not apply to tracking your organic traffic. Was the drop from seasonality? Or, did a competitor launch a new campaign?

As you can see from the example above, traffic is above the previous year, but it does decrease toward the end of the year. This helps us to normalize traffic each year. And, you can compare this year’s traffic as a percentage of last year’s traffic.

Here’s an example below:

[Screenshot: organic year-over-year percentage report in Google Analytics]

The decrease in traffic has some surprise drops; it’s not very regular. There’s a significant decrease in May and again in October, which could be seasonality or SEO visibility. Regarding SEO visibility, this could be a change in JSON, PPC, or keyword positioning. I’d suggest digging deeper to identify the causes for your client.

Bonus Tip:

If you want to take your organic traffic year-over-year reports to the next level, upload your data from above into Distilled’s Forecaster tool. This tool will help you predict future traffic levels based on your historical data. And it will help you map out your on-site technical SEO recommendations. For example, if you’re planning to redo the metadata in Q1 of next year, your changes may affect overall site traffic.

[Screenshot: organic traffic forecast in the Distilled Forecaster tool]

2. Scroll Depth Tracking

How to Build This Report:

1. Your website must already be connected to Google Tag Manager.

2. Open Google Tag Manager, click “Add a New Tag.”

[Screenshot: adding a new tag in Google Tag Manager]

3. In Tag Configuration, click “Custom HTML.”

[Screenshot: Custom HTML tag configuration in Google Tag Manager]

4. Copy and paste this code by Rob Flaherty.

<script>
/*!
 * @preserve
 * jquery.scrolldepth.js | v0.4.1
 * Copyright (c) 2014 Rob Flaherty (@robflaherty)
 * Licensed under the MIT and GPL licenses.
 */
;(function ( $, window, document, undefined ) {
   
  "use strict";
 
  var defaults = {
    elements: [],
    minHeight: 0,
    percentage: true,
    testing: false
  },
 
  $window = $(window),
  cache = [];
 
  /*
   * Plugin
   */
 
  $.scrollDepth = function(options) {
     
    var startTime = +new Date;
 
    options = $.extend({}, defaults, options);
 
    // Return early if document height is too small
    if ( $(document).height() < options.minHeight ) {
      return;
    }
	
     // Get some information about the current page
    var pageTitle = document.title;
	
    // Establish baseline (0% scroll)
    sendEvent(pageTitle,'Baseline');
 
    /*
     * Functions
     */

    function sendEvent(action, label, timing) {
      if (!options.testing) {
        if (typeof(dataLayer) !== "undefined") {
          dataLayer.push({'event':'ScrollDistance', 'eventCategory':'Reading', 'eventAction': action, 'eventLabel': label, 'eventValue': 1, 'eventNonInteraction': true});
          if (arguments.length > 2) {
            dataLayer.push({'event':'ScrollTiming', 'eventCategory':'Reading', 'eventAction': action, 'eventLabel': label, 'eventTiming': timing});
          }
        } else {
          if (typeof(ga) !== "undefined") {
            ga('send', 'event', 'Reading', action, label, 1, {'nonInteraction': 1});
            if (arguments.length > 2) {
              ga('send', 'timing', 'Reading', action, timing, label);
            }
          }
          if (typeof(_gaq) !== "undefined") {
            _gaq.push(['_trackEvent', 'Reading', action, label, 1, true]);
            if (arguments.length > 2) {
              _gaq.push(['_trackTiming', 'Reading', action, timing, label, 100]);
            }
          }
        }
      } else {
        $('#console').html(action + ': ' + label);
      }
    }

    function calculateMarks(docHeight) {
      return {
        '25%' : parseInt(docHeight * 0.25, 10),
        '50%' : parseInt(docHeight * 0.50, 10),
        '75%' : parseInt(docHeight * 0.75, 10),
        // 1px cushion to trigger 100% event in iOS
        '100%': docHeight - 1
      };
    }

    function checkMarks(marks, scrollDistance, timing) {
      // Check each active mark
      $.each(marks, function(key, val) {
        if ( $.inArray(key, cache) === -1 && scrollDistance >= val ) {
          sendEvent(pageTitle, key, timing);
          cache.push(key);
        }
      });
    }

    function checkElements(elements, scrollDistance, timing) {
      $.each(elements, function(index, elem) {
        if ( $.inArray(elem, cache) === -1 && $(elem).length ) {
          if ( scrollDistance >= $(elem).offset().top ) {
            sendEvent('Elements', elem, timing);
            cache.push(elem);
          }
        }
      });
    }

    /*
     * Throttle function borrowed from:
     * Underscore.js 1.5.2
     * http://underscorejs.org
     * (c) 2009-2013 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors
     * Underscore may be freely distributed under the MIT license.
     */

    function throttle(func, wait) {
      var context, args, result;
      var timeout = null;
      var previous = 0;
      var later = function() {
        previous = new Date;
        timeout = null;
        result = func.apply(context, args);
      };
      return function() {
        var now = new Date;
        if (!previous) previous = now;
        var remaining = wait - (now - previous);
        context = this;
        args = arguments;
        if (remaining <= 0) {
          clearTimeout(timeout);
          timeout = null;
          previous = now;
          result = func.apply(context, args);
        } else if (!timeout) {
          timeout = setTimeout(later, remaining);
        }
        return result;
      };
    }

    /*
     * Scroll Event
     */

    $window.on('scroll.scrollDepth', throttle(function() {
      /*
       * We calculate document and window height on each scroll event to
       * account for dynamic DOM changes.
       */
      var docHeight = $(document).height(),
        winHeight = window.innerHeight ? window.innerHeight : $window.height(),
        scrollDistance = $window.scrollTop() + winHeight,

        // Recalculate percentage marks
        marks = calculateMarks(docHeight),

        // Timing
        timing = +new Date - startTime;

      // If all marks already hit, unbind scroll event
      if (cache.length >= 4 + options.elements.length) {
        $window.off('scroll.scrollDepth');
        return;
      }

      // Check specified DOM elements
      if (options.elements) {
        checkElements(options.elements, scrollDistance, timing);
      }

      // Check standard marks
      if (options.percentage) {
        checkMarks(marks, scrollDistance, timing);
      }
    }, 500));

  };

})( jQuery, window, document );

jQuery.scrollDepth();
</script>


GitHub has some additional data on coding your scroll depth.
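If you want more control over what gets tracked, you can replace the bare jQuery.scrollDepth() call at the very end of the snippet with one that passes options. The option names (elements, minHeight, percentage) come from the defaults defined at the top of the plugin; the element selectors below are just hypothetical examples:

// Call the plugin with options instead of the default jQuery.scrollDepth();
jQuery.scrollDepth({
  minHeight: 800,                      // skip tracking on very short pages
  percentage: true,                    // keep the 25/50/75/100% scroll marks
  elements: ['#comments', '#footer']   // also fire an event when these elements scroll into view
});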

5. Choose how to display your scroll depth. Some users prefer percentages (10%, 25%, 50%, 75%, 100%) and others choose labels like Article Loaded, Start Reading, Content Bottom, etc. This particular code uses percentages, but you can change the naming conventions if needed. Here is the scroll tracking code for that by Justin Cutroni:

<script>
jQuery(function($) {
    // Debug flag
    var debugMode = false;

    // Default time delay before checking location
    var callBackTime = 100;

    // # px before tracking a reader
    var readerLocation = 150;

    // Set some flags for tracking & execution
    var timer = 0;
    var scroller = false;
    var endContent = false;
    var didComplete = false;

    // Set some time variables to calculate reading time
    var startTime = new Date();
    var beginning = startTime.getTime();
    var totalTime = 0;
    
    // Get some information about the current page
    var pageTitle = document.title;

    // Track the article load
    if (!debugMode) {
        ga('send', 'event', 'Reading', pageTitle,'Article Loaded', {'nonInteraction': 1});
    } else {
        alert('The page has loaded. Woohoo.');    
    }

    // Check the location and track user
    function trackLocation() {
        bottom = $(window).height() + $(window).scrollTop();
        height = $(document).height();

        // If user starts to scroll send an event
        if (bottom > readerLocation && !scroller) {
            currentTime = new Date();
            scrollStart = currentTime.getTime();
            timeToScroll = Math.round((scrollStart - beginning) / 1000);
            if (!debugMode) {
                ga('send', 'event', 'Reading', pageTitle, 'Start Reading', timeToScroll, {'metric1': timeToScroll});
            } else {
                alert('started reading ' + timeToScroll);
            }
            scroller = true;
        }

        // If user has hit the bottom of the content send an event
        if (bottom >= $("#authorTemplate").scrollTop() + $("#authorTemplate").innerHeight() && !endContent) {
            currentTime = new Date();
            contentScrollEnd = currentTime.getTime();
            timeToContentEnd = Math.round((contentScrollEnd - scrollStart) / 1000);
            if (!debugMode) {
                if (timeToContentEnd < 60) {
                    ga('set', 'dimension1', 'Scanner');
                } else {
                    ga('set', 'dimension1', 'Reader');
                }
                ga('send', 'event', 'Reading', pageTitle, 'Content Bottom', timeToContentEnd, {'metric2': timeToContentEnd});
            } else {
                alert('end content section ' + timeToContentEnd);
            }
            endContent = true;
        }

        // If user has hit the bottom of page send an event
        if (bottom >= height && !didComplete) {
            currentTime = new Date();
            end = currentTime.getTime();
            totalTime = Math.round((end - scrollStart) / 1000);
            if (!debugMode) {
                ga('send', 'event', 'Reading', pageTitle, 'Page Bottom', totalTime, {'metric3': totalTime});
            } else {
                alert('bottom of page ' + totalTime);
            }
            didComplete = true;
        }
    }

    // Track the scrolling and track location
    $(window).scroll(function() {
        if (timer) {
            clearTimeout(timer);
        }

        // Use a buffer so we don't call trackLocation too often.
        timer = setTimeout(trackLocation, callBackTime);
    });
});
</script>

6. Choose how you want the scroll depth tracking to work. Do you want to send an event after the user scrolls 200 pixels? You can change the value to whatever works best for you. If you’re comfortable with coding, you can edit this in the code above (just be sure to update the related areas as well):

ga('send', 'event', 'Reading', pageTitle,'Article Loaded', {'nonInteraction': 1});
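The pixel threshold itself lives in the readerLocation variable near the top of the script above. For example, to wait until a visitor has scrolled past 200 pixels before the 'Start Reading' event fires, you could change it to:

// # px before tracking a reader -- raised from the script's default of 150
var readerLocation = 200;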


7. Under “Triggers,” add “All Page View.”

[Screenshot: GTM trigger configuration for scroll depth tracking]

8. Go to Variables > New (under User-Defined Variables) > Data Layer Variable. Then, add eventCategory. Repeat this step for eventAction, eventLabel, and eventValue.

[Screenshot: creating a data layer variable in Google Tag Manager]

[Screenshot: data layer variable settings in Google Tag Manager]

9. Head back to Tags > New > Google Analytics (choose Classic or Universal, depending on your setup). It should look something like the below:

[Screenshot: Google Analytics tag configuration in Google Tag Manager]

10. Under that same Tag, go to Triggers > Custom Event. Then, give it a name.

[Screenshot: custom event trigger for scroll depth in Google Tag Manager]

To make sure your GTM code is firing, go to Behavior > Events > Top Events > Scroll Tracking in Google Analytics. It should look something like this:

[Screenshot: scroll tracking event percentages in Google Analytics]

If this sounds like mumbo-jumbo to you, then stick to something like Crazy Egg’s Scrollmap tool.

What This Report Tells You:

Let’s just come out and say it — do you know if people are reading your content? Or, looking at all your products?

Once you add Scroll Depth tracking, you can identify whether users are making it to the bottom of your page. Scroll Depth tracking is essential if you have a one-page website or a very long homepage. Knowing your scroll depth can help you structure and prioritize relevant site information at the top if the majority of users aren’t scrolling down. For example, if your higher-priced products are at the bottom, you may want to reorganize them to mingle with the lower-priced items. Scroll Depth tracking allows you to further develop your pages with proper content placement.

Bonus Tip:

You’ll want to pair Scroll Depth tracking with other engagement metrics like advanced click event tracking, pageviews, bounce rate, and time on page to track interactions and performance. You can set this up with micro conversions in Google Analytics. Your goal should look something like this:

Now, every session (or visit) in which a user scrolls more than 75% of the page is recorded as a conversion. This tells me these users are engaged with the site’s design and content.
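For reference, the 75% event that this goal needs to match is pushed to the data layer by the scroll-depth script earlier in this section in roughly this shape (Category "Reading", Label "75%"):

dataLayer.push({
  'event': 'ScrollDistance',
  'eventCategory': 'Reading',
  'eventAction': document.title,   // the page title
  'eventLabel': '75%',
  'eventValue': 1,
  'eventNonInteraction': true
});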

3. Micro Conversion Tracking

How to Build This Report:

  1. Identify your micro conversions. This is different for every site. I would suggest creating micro conversions if you have downloadable files, a newsletter sign-up, an account creation step, or an add-to-cart function. You may need to create a ‘Thank You’ page to begin tracking these conversions. Here’s an example from SEJ.
  2. Set up goal tracking in Google Analytics. Again, depending on your micro conversion, this process will be different.
  3. Verify your goal and watch the numbers start rolling in.

What This Report Tells You:

Micro conversion tracking shows you the steps a user takes before converting. It will allow you to see patterns that users frequently do before they purchase your product or contact you.

Measuring micro conversions is just as important as measuring macro conversions (the purchase). The more you know about what users are doing before they buy, the more you can give them what they want and optimize your site further.

Here are a few examples of micro conversions:

  • Comment on an Instagram post
  • Sign-up for your newsletter
  • Read a blog post
  • Download a size chart

Here are some ideas of macro conversions:

  • Purchasing a product
  • Contacting you

Tracking micro conversions helps you follow users down the conversion funnel more accurately. Glenn Gabe talks more about conversion goals in Google Analytics.

Bonus Tip:

Once you have a good grasp on micro conversions, start assigning a dollar value to the micro conversions.

For example, say 100 users sign up for your newsletter and 50 users watch your product videos, but the users who watch your product videos are more likely to buy your product; the video watchers have the higher conversion rate. Once you have the conversion rate, you can attach an average order value to determine where to spend your budget.

Here’s an example of the formula I would use:

Average value of watching a product video = revenue generated by users who watched the video / number of users who watched the video.
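As a quick worked example (the revenue figure here is made up): if the 50 video watchers from the scenario above generated $2,500 in revenue, the calculation looks like this:

var videoWatchers = 50;                  // users who watched a product video
var revenueFromWatchers = 2500;          // hypothetical revenue those users generated
var avgValuePerWatcher = revenueFromWatchers / videoWatchers;
console.log('Average value of watching a product video: $' + avgValuePerWatcher);   // $50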

4. Organic Landing Page Traffic

How to Build This Report:

  1. Head over to Google Analytics > Acquisition > Channels > Organic Search.
  2. Add the secondary dimension of “Landing Page.”

[Screenshot: organic landing page report in Google Analytics]

What This Report Tells You:

When you’re able to match up your keyword terms with your landing pages, you’re able to work around the (not provided) issue. Now, instead of compensating for keyword data, you’re getting a holistic view of how pages on your website are performing. If your click-through rate is low and your bounce rate is high, you may want to consider working with a web designer to update your site. Or, if your bounce rate is high (depending on the site, I’m looking for anything over 75%), then you may want to consider rewriting the page to reflect more accurate search queries.

Bonus Tip:

Take this a step further by using “Landing Page” as your primary dimension and adding “Page Depth” as your secondary dimension. With this view, you’ll have an even better sense of which pages are driving high-quality traffic. Understanding which pages push users to click around your site more will help you grasp what type of content your users want.

5. Multi-channel Funnels Assisted Conversions

How to Build This Report:

  1. In Google Analytics, go to Conversions > Assisted Conversions.

[Screenshot: assisted conversions report in Google Analytics]

2. Create a new segment including any interaction from Organic Search and excluding the last interaction from Organic Search.

[Screenshot: organic search segment setup in Google Analytics]

What This Report Tells You:

Multi-Channel Funnel (MCF) reports help you identify first- and last-click attribution. In short, MCF reports show you what interactions your customers are taking before they complete a goal.

MCF reports are impressive for direct traffic since direct traffic isn’t correctly attributed in the main channel reports. They give you a wider scope of what is happening on your site. As an SEO consultant, I want to know what campaigns are contributing to the success or failure of my client’s website. SEO converts users, but it also assists in conversions, so I want to know what’s working and what’s not, especially for my e-commerce clients. Typically with e-commerce, I won’t see a first-time visitor convert, but with MCF reporting I can more accurately evaluate what channels are working.

When you add the new segment to determine the true value of organic search, you can see the assisted conversions and the assisted conversion value.

Bonus Tip:

It’s important to note that MCF reports default to a 30-day lookback window, while Acquisition reports give you far more data. To fix this, change the default setting to 90 days. It will give you more data to work with and let you look at the bigger picture.

[Screenshot: 90-day lookback window for assisted conversions in Google Analytics]

6. Organic Traffic Keyword Value

How to Build This Report:

  1. In Google Analytics, go to Behavior > Site Search > Search Terms and export the data. If you have Google Search Console connected, you can also check out Acquisition > Search Console > Queries to connect the dots in your data.
  2. Go to Google Keyword Planner and upload your data under “Get search volume data and trends.”
  3. Use the “Suggested bid” to give value to your search terms.

What This Report Tells You:

Giving a dollar value to the keyword terms that are already driving traffic to your site helps your client visualize the worth of your SEO strategy. It’s one thing to tell a client about their domain authority, traffic, and conversions from organic traffic, but it’s a whole other ball game when you can show that a blog post positioned around a keyword term brought in the majority of revenue last month.

Bonus Tip:

To display this data, create a spreadsheet listing impressions, clicks, cost-per-click (from the “Suggested bid” above) and total value.

[Screenshot: keyword organic value spreadsheet]
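If you would rather script that last column than type spreadsheet formulas, here is a minimal sketch that assumes estimated value = organic clicks multiplied by the keyword’s suggested bid; the keyword rows are hypothetical:

var keywords = [
  { term: 'example keyword a', clicks: 320, suggestedBid: 1.40 },
  { term: 'example keyword b', clicks: 90,  suggestedBid: 4.10 }
];

keywords.forEach(function (k) {
  // Estimated value: roughly what the same clicks would have cost in paid search
  console.log(k.term + ': $' + (k.clicks * k.suggestedBid).toFixed(2));
});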

Your Turn

Okay, you know the drill: It’s the last month of the year, and holiday marketing campaigns are rolling in. And yes, this is a thrilling time, but there is a certain level of stress that comes with analyzing the Q4 data. It can be especially stressful for those who aren’t familiar with all the tips and tricks of Google Analytics reporting, which, let’s be honest, was all of us at some point.

These Google Analytics reports are available for free and are perfect for all of my favorite fall holidays. Because, supposedly, holidays are all about giving thanks. From tracking SEO performance to assisted conversions, I am thankful for these Google Analytics reports and all the delicious data they send my way. The reports above are just a few to help you get started.

Thanks so much for reading! I hope that gives you something to be cheerful for this holiday season. I’d love to know what reports and metrics you look at in Google Analytics. What’s important to you? What reports do your clients find most beneficial? I’d love to hear from you in the comments below.


Author: Anna Crowe
Source: https://www.searchenginejournal.com/google-analytics-reports-holiday/179077


The big data revolution is upon us. Firms are scrambling to hire a new brand of analysts dubbed “data scientists,” and universities have responded to this demand by introducing data science courses into degrees ranging from computer science to business. Survey-based reports find that firms are currently spending an estimated $36 billion on storage and infrastructure, and that is expected to double by 2020.

Once companies are logging and storing detailed data on all their customer engagements and internal processes, what’s next? Presumably, firms are investing in big data infrastructure because they believe that it offers a positive return on investment. However, looking at the surveys and consulting reports, it is unclear what the precise use cases are that will drive this positive ROI from big data.

Our goal in this article is to offer specific, real-world case studies to show how big data has provided value for companies that have worked with Microsoft’s analytics teams. These cases reveal the circumstances in which big data predictive analytics are likely to enable novel and high-value solutions, and the situations where the gains are likely to be minimal.

Predicting demand. The first use case involves predicting demand for consumer products that are in the “long tail” of consumption. Firms value accurate demand forecasts because inventory is expensive to keep on shelves and stockouts are detrimental to both short-term revenue and long-term customer engagement. Aggregated total sales is a poor proxy because firms need to distribute inventory geographically, necessitating hyperlocal forecasts. The traditional way of solving this problem is using time-series econometrics with historical sales data. This method works well for popular products in large regions but tends to fail when data gets thin because random noise overwhelms the underlying signal.

A big data solution to this problem is to use anonymized and aggregated web search or sentiment data linked to each store’s location on top of the existing time-series data. Microsoft data scientists have employed this approach to help a forecasting firm predict auto sales. Building models with web search data as one of the inputs reduces mean absolute forecast error, a standard measure of prediction accuracy, for monthly national sales predictions on the order of 40% from baseline for auto makes with relatively small market shares, compared to traditional time-series models. Although the gains were smaller for the most popular models at the national level, the relative improvement increases as one drills down to the regional level.
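For readers unfamiliar with the metric, mean absolute forecast error is simply the average absolute gap between predicted and actual sales. A minimal sketch, with made-up monthly figures:

// Mean absolute error over a set of forecasts.
function meanAbsoluteError(actual, forecast) {
  var total = actual.reduce(function (sum, a, i) {
    return sum + Math.abs(a - forecast[i]);
  }, 0);
  return total / actual.length;
}

var actual   = [120, 135, 110, 150];   // hypothetical actual monthly sales
var forecast = [130, 125, 118, 141];   // hypothetical forecasts
console.log(meanAbsoluteError(actual, forecast));   // 9.25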

In this case, the big data solution leverages the previously unused data point that people do a considerable amount of social inquiry and research online before buying a car. The increased prediction accuracy, in turn, makes it possible to achieve large increases in operational efficiency — having the right inventory in the right locations.

Anonymized web search data has proven to be helpful for other forecasts as well since online activity often is a good leading proxy for purchases and actions of the general public. Having the additional data is insufficient on its own. Processing search data and combining it with traditional sources is vital in creating a successful prediction: We found that raw search query volume is insufficient in parsing out the signals that correlate to true product demand.

Being intelligent about which signals to draw from big data requires care, and best practices can be case-specific. For example, single queries from a user might be less important than multiple queries from a user. Although we used search data in this case study, a firm could just as easily use the location of users visiting their website or link detailed sales data to a customer’s location.

Improved pricing. Using a single price is economically inefficient because part of the demand curve that could be profitably served is priced out of the market. As a consequence, firms regularly offer targeted discounts, promotions, and segment-based pricing to target different consumers. E-commerce websites have a distinct advantage in pursuing such an approach because they log detailed information on customer browsing, not just the goods they end up purchasing, and aggressively adjust prices over time. These price adjustments are a form of experimentation and, jointly with big data, allow firms to learn more about their customers’ price responsiveness.

Offline retailers can mimic e-commerce’s nuanced pricing strategies by tracking consumers through smartphone connectivity and logging which customers enter the store, what type of goods they look at, and whether they make a purchase. Machine learning applied to this data can algorithmically generate customer segments based on price responsiveness and preferences, which generally offers a large improvement on traditional demographic-based targeting.
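As an illustration only (this is not the approach Microsoft describes, just a crude sketch of the idea), segmenting shoppers by how much a discount changes their purchase rate might start out like this, with hypothetical visit records:

// Each visit record: { customerId, sawDiscount, purchased }
function segmentByPriceResponsiveness(visits) {
  var stats = {};
  visits.forEach(function (v) {
    var s = stats[v.customerId] || (stats[v.customerId] = {
      discountVisits: 0, discountBuys: 0, fullVisits: 0, fullBuys: 0
    });
    if (v.sawDiscount) { s.discountVisits++; if (v.purchased) s.discountBuys++; }
    else               { s.fullVisits++;     if (v.purchased) s.fullBuys++; }
  });

  var segments = { priceSensitive: [], priceInsensitive: [] };
  Object.keys(stats).forEach(function (id) {
    var s = stats[id];
    var discountRate = s.discountVisits ? s.discountBuys / s.discountVisits : 0;
    var fullRate = s.fullVisits ? s.fullBuys / s.fullVisits : 0;
    // A shopper who buys noticeably more often when discounted counts as price sensitive.
    (discountRate - fullRate > 0.1 ? segments.priceSensitive : segments.priceInsensitive).push(id);
  });
  return segments;
}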

Our experience with pricing advertising on the Bing search engine is that using big data can produce substantial gains by better matching advertisers to consumers. The success of algorithmic targeting has been well documented and is a key driver of revenue in the online advertising market. Advances in measurement technology increasingly allow offline firms to benefit from these types of gains through more efficient pricing.

Predictive maintenance. Smoothly operating supply chains are vital for stable profits. Machine downtime imposes a cost to firms due to forgone productivity and can be particularly disruptive in both complex manufacturing supply chains and consumer products. Executives in asset-intensive industries often state that the primary operational risk to their businesses is unexpected failures of their assets. A wave of new data generated by the “internet of things” (IoT) can provide real-time telemetry on detailed aspects of production processes. Machine-learning models trained on these data allow firms to predict when different machines will fail.

Airlines are particularly interested in predicting mechanical failures in advance so that they can reduce flight delays or cancellations. Microsoft data scientists from the Cortana Intelligence Suite team are able to predict the probability of aircraft being delayed or canceled in the future based on relevant data sources, such as maintenance history and flight route information. A machine-learning solution based on historical data and applied in real time predicts the type of mechanical issue that will result in a delay or cancellation of a flight within the next 24 hours, allowing the airlines to take maintenance actions while the aircraft are being serviced, thus preventing possible delays or cancellations.

Similar predictive-maintenance solutions are also being built in other industries — for example, tracking real-time telemetry data to predict the remaining useful life of an aircraft engine, using sensor data to predict the failure of an ATM cash withdrawal transaction, employing telemetry data to predict the failure of electric submersible pumps used to extract crude in the oil and gas industry, predicting the failures of circuit boards at early stages in the manufacturing process, predicting credit defaults, and forecasting energy demand in hyperlocal regions to predict overload situations on energy grids. Machine learning will make supply chains less brittle and reduce the effects of disruptions for many goods and services.

These cases help highlight a few general principles:

  • The value derived from the analytics piece can greatly exceed the cost of the infrastructure. This indicates there will be strong growth in big data consulting services and specialized roles within firms.
  • Big data is less about size and more about introducing fundamentally new information to prediction and decision processes. This information matters most when existing data sources are insufficient to provide accurate or actionable predictions — for example, due to small sample sizes or coarseness of historical sales (small effective regions, niche products, new offerings, etc.).
  • The new information is often buried in detailed and relatively unstructured data logs (known as a “data lake”), and techniques from computer science are needed to extract insights from it. To leverage big data, it is vital to have talented data engineers, statisticians, and behavioral scientists working in tandem. “Data scientist” is often used to refer to someone who has these three skills, but in our experience single individuals rarely have all three.

Radically new applications. The cases that we’ve discussed concern how big data can be employed to improve existing processes (e.g., more-precise demand forecasts, better price sensitivity estimates, better predictions of machine failure). But it also has the potential to be applied in ways that disrupt existing processes. For example, machine-learning models taking massive data sets as inputs, coupled with clever designs that account for patient histories, have the potential to revolutionize how certain diseases are diagnosed and treated. Another example involves matching distributed electricity generation (e.g., solar panels on roofs) to localized electricity demand, unlocking huge value by equating electricity supply and demand with more-efficient generation.

More accurate demand prediction, better pricing, and predictive maintenance are the specific use cases that easily justify large firms’ investments in big data infrastructure and data science. These uses are likely to drive value of the same order of magnitude as the investments. The value of radically new applications is challenging to understand ex ante and speculative by nature. It is reasonable to expect losses for many firms, due to uncertain and higher-risk investments, with a few firms earning spectacular profits.

Author:  Jacob LaRiviere, Preston McAfee, Justin Rao, Vijay K. Narayanan, and Walter Sun

Source:  https://hbr.org/2016/05/where-predictive-analytics-is-having-the-biggest-impact


The American Civil Liberties Union recently uncovered evidence that led Twitter, Facebook and its Instagram subsidiary to stop sharing data with Geofeedia, a firm accused of improperly collecting social media data on protest groups, and sharing that information with numerous law enforcement agencies.

Geofeedia, a developer of location-based analytics, had been marketing its technology to law enforcement agencies. It was used for such purposes as monitoring Black Lives Matter protests in Ferguson, Missouri, and Baltimore, Maryland, according to the ACLU.

The ACLU of Northern California uncovered the practice after requesting public records information from 63 law enforcement agencies in California.

The documents revealed that Instagram had provided Geofeedia access to streams of user posts, called the "Instagram API," until that practice was terminated last month, according to Matt Cagle, technology and civil liberties policy attorney for the ACLU of Northern California.

The data also shows that Facebook provided Geofeedia access to its Topic Feed API, which is supposed to be used for media and branding purposes, according to the ACLU. The API gave the firm access to a ranked feed of public posts that mention a specific topic.

API Access

Geofeedia had access to Facebook's API source information, said Facebook spokesperson Jodi Seth. Using APIs the way Geofeedia did is a "violation of our platform policies, which prohibit the sale or transfer of data," she told TechNewsWorld.

"This developer only had access to data that people chose to make public," Facebook said in a statement. "Its access was subject to the limitations in our Platform Policy, which outlines what we expect from developers that receive data using the Facebook Platform. If a developer uses our APIs in a way that has not been authorized, we will take swift action to stop them and we will end our relationship altogether if necessary." 

Facebook terminated Geofeedia's access to its APIs last month, after learning about the infractions, Seth said.

While not providing access to its Firehose technology, Twitter did allow a subsidiary to provide Geofeedia with searchable access to public tweets, the ACLU said.

Twitter earlier this year added contract language designed to protect users against further surveillance techniques, the organization noted.

Based on information in the ACLU report, Twitter suspended @Geofeedia's commercial access to Twitter data.

The ACLU's Cagle acknowledges in a post on the organization's site that "neither Facebook nor Instagram has a public policy specifically prohibiting developers from exploiting user data for surveillance purposes," Twitter spokesperson Nu Wexler pointed out to TechNewsWorld.

The ACLU post goes on to say that "Twitter does have a 'longstanding rule' prohibiting the sale of user data for surveillance as well as a developer policy that bans the use of Twitter data to 'investigate, track or surveil Twitter users.'"

Twitter this spring cut off U.S. intelligence agencies from access to Dataminr, a firm that scans social media activity for information on potential terrorist attacks and political unrest, Wexler noted, pointing to a Wall Street Journal story published in May.

Targeted Protesters

Facebook severed its agreement with Geofeedia because it violated Facebook's data-sharing policies, noted Brandi Collins, campaign director of Color of Change, which had joined the ACLU and the Center for Justice in making the document request.

Facebook's decision to abandon the agreement suggests that the methods Geofeedia was employing were illegal, Collins told TechNewsWorld.

"More broadly, we should be concerned that police departments are wasting critical public resources on monitoring the social media profiles of the people in their communities, they're supposed to be protecting," she said.

"Geofeedia brags about its success monitoring protesters in Ferguson," Collins remarked, "but how does tracking people who are protesting police killings of unarmed black people make any of us safe?" 

Source: technewsworld


The hotel and hospitality sector caters to millions of travellers every day, and each one of them checks in with their own set of expectations. Meeting those expectations is the key to getting people to return, and increasingly hotel and leisure operators are turning to advanced analytics solutions for clues about how to keep their customers happy.

Additionally, although their marketing departments would be loath to admit it, not all guests are equal in the eyes of hotel and leisure operators. Some will simply check in and check out with a minimum of fuss. Others will spend hundreds or thousands of dollars on fine dining, entertainment, sports activities and spa treatments. Identifying those customers with a higher overall lifetime value to a particular business is hugely important in today’s market, but a customer’s lifetime value might not be empirically obvious from observing their behavior during one visit.


For example, a high-rolling customer spending money like it is going out of fashion in the hotel casino may be on a “holiday of a lifetime” following retirement, and unlikely to behave in this way every day. Meanwhile a frugal business customer taking an economy room and spending very little on extra services may be a travelling businessman who will potentially return frequently if the hotel meets his needs, and therefore have a higher lifetime value. Big Data analytics can help make this distinction.

A third overarching use of analytics in the hotel industry revolves around “yield management”. This is the process of ensuring that each room attracts the optimal price – taking into account troughs and peaks in demand throughout the year as well as other factors, such as weather and local events, which can influence the number (and type) of guests checking in.

Analytics has applications in all of these areas, and although the hotel and hospitality sector has lagged behind others such as retail and manufacturing in adopting an analytics-first philosophy, that could be starting to change.

One pioneering example is US economy hotel chain Red Roof Inn, which, during the record-setting winter of 2013/2014, realized the huge value of having a number of hotels close to major airports at a time when the flight cancellation rate was around 3%. This meant around 90,000 passengers were being left stranded every day. The chain’s marketing and analytics team worked together to identify openly available public datasets on weather conditions and flight cancellations. Knowing that most of their customers would use web search on mobile devices to look for nearby accommodation, a targeted marketing campaign was launched, aimed at mobile device users in the geographical areas most likely to be affected. This led to a 10% increase in business in areas where the strategy was deployed.

Another US chain which has been recognized for their innovative use of analytics is Denihan Hospitality, which owns boutique hotels across the US including the James and Affinia Hotels brands. Denihan used IBM analytics technology to bring together transactional and customer data across its chains, and combine it with unstructured data such as customer feedback comments and reviews left on rating sites such as Tripadvisor. Menka Uttamchandani, the company’s vice president of business intelligence, said “Every company has massive amounts of data – it is what one does with that data – such as providing relevant dashboards, click through deep dive actionable reporting and analytical insight that can foster a competitive edge.” 

After evaluating customer feedback and transactional data, the chain took strategic, data-driven decisions to rearrange many of their rooms to better cater to either business or leisure travellers, provide more bathroom storage for rooms popular with travelling families, and provide a greater range of in-room facilities such as kitchenettes where guests would appreciate them.

The chain even went as far as putting analytics in the hands of the frontline hotel staff, who were armed with dashboards on their smartphones enabling them to anticipate what a particular guest might expect or desire from their stay, in terms of restaurant meals, concierge services or excursions to local places of interest. Housekeeping staff receive real-time updates on whether customers in a particular room require an extra pillow or are likely to call room service for a sandwich and a coffee at 2am.

Of course as in most industries, a majority of analytical work in the hospitality industry is focused on marketing. The overall aim is often to launch personalized marketing campaigns in the form of email or targeted social media advertising. This involves analyzing all of the information available about customers who are visiting, by gathering customer feedback, transactional activity, use of loyalty programs and bought-in third party demographic data. This is then used to decide whether, for example, an offer of a free restaurant meal, or a ticket for a show at a nearby theatre is more likely to persuade a high lifetime-value customer to make a booking.

At Marriott, however, Big Data is not confined to marketing, and has been put to use across the hotel chain’s operations. Unstructured and semi-structured datasets such as weather reports and local events schedules are used to forecast demand and determine a value for each individual room throughout the year. This enables the chain to set prices with optimum efficiency – vital in an age where customers are used to saving pennies by scanning price comparison services for the best deals. 

Starwood Hotels and Resorts, which owns 1,200 hotels around the world, is another large chain which has heavily invested in Big Data and analytics. Their system, too, is based around optimizing room pricing by analyzing data on local and worldwide economic factors, events and weather reports. Knowing how the home weather of their core customer base in North America impacts the price that those customers are willing to spend for a week in the Caribbean sunshine gives them prompts as to the best time to reduce prices or launch marketing promotions. This strategy has led to an increase in its revenue-per-room – a key metric for hotels – of almost 5%.

The hotel and hospitality industry may just be starting out with Big Data but it has an enviable volume and variety of data to work with. Customers leave a data trail from the moment they book to the moment they check out, and analysts are beginning to get to grips with turning that data into actionable insights. Once it gets into its stride, I expect we will see more innovation from this particular sector which should result in more satisfying stays for us as customers.

Source: forbes

