Our tablets and mobile phones are amazingly agile in running a wide array of apps. Just head over to your app store of choice, jump on some Wi-Fi, and the downloading frenzy can begin. Your phone probably defaults to scattering those little icons all over your home screens. While the apps themselves are a fantastic resource, the clutter can become a hot mess. Fortunately, you don't have to live with the chaos. Here are some ideas for organizing all those incredible applications.

1. Action Categories

If you need to look something up on Wikipedia and listen to your iTunes library, why not center your organization on these concepts? All you need to do is create folders that reflect the best action word associated with each group of apps.

2. Color Coding

If you are a really visual learner, your best organizational scheme might center around color. We all know Snapchat features a primarily yellow icon and Facebook is largely blue. Thus we can drag Facebook next to Twitter and Snapchat alongside Apple Maps and, BAM, color coding achieved. This is a great method for those of us who relate first to an application's image rather than its function.

3. Frequency of Use

We all have a few apps we rely on almost every day. If you want to minimize time spent searching for icons and maximize time spent getting the information you love to have, organization by frequency is a great option. One way to accomplish this is by assigning each home screen a level of frequency. The first screen can hold the items you use every day. Swipe once and find the items used a few times a week. Swipe again and find the lesser-used apps.

4. Themed Rows

Remember back in college when your favorite club would have themed meetings? Everyone would come dressed in their pajamas or favorite Hawaiian shirts? Well, you can relive some of those fond memories by organizing your apps around central themes. Instead of pajama days, you can assign each row its own theme. For example, you may have a maps row, a social media row and a knowledge base row.

5. Break out the Widgets

Widgets, primarily used on Android, are a great way to quickly access a lot of information. By plopping a widget onto one of your home screens, you can creatively manage space. Widgets are a great organizational tool for those who want to collect key information without additional clicks.

6. One Central Home Screen with Folders

One of the great things about modern phones and tablets is their flexibility in the number of home screens. If you like to swipe a lot, you can have multiple screens. However, if you prefer a simpler starting point, you can center everything on one screen and fit everything in via folders.

7. SmartBar

This Android tool combines several features in an easy-to-access, centralized manner. Rather than having to click through screens to find the app you need, you can simply access it in a click or two with SmartBar. With this tool set up on your home screen, you can organize the applications you want around this powerful feature.

8. Hand Position

Another simple method of organizing your applications is ease of use when holding the phone. Everyone prefers to hold their phone in a slightly different way. Given that this particular grip is likely the one used to open most applications, it can be a useful organizational tool. Simply place the apps you use most often closest to the finger you use most to operate the phone. Whether this is your thumb or index finger, this organizational scheme can increase the speed at which you operate your phone.

 Source: lifehack.org

Categorized in Science & Tech


As of late June, 32.5% of page-one Google results now use the HTTPS protocol, according to a new study from Moz. The esteemed Dr. Pete published a blog post this week on the data Moz has been tracking in the two years since Google announced, in August 2014, that HTTPS would be a light ranking signal.

The results are definitely enough to give SEOs pause for thought when it comes to considering whether to switch their sites to a secure protocol.

What is HTTPS?

In case you need a refresher, here is Jim Yu’s explanation of the difference between http and HTTPS:

HTTP is the standard form used when accessing websites. HTTPS adds an additional layer of security by encrypting in SSL and sharing a key with the destination server that is difficult to hack.
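
In practical terms, that extra layer is the TLS/SSL handshake and the certificate the server presents. As a minimal illustration (standard library only; the hostname is just an example), here is a sketch that opens a TLS connection and prints the server's certificate details:

```python
# Minimal sketch: open a TLS (SSL) connection and inspect the certificate,
# i.e. the encryption layer that distinguishes HTTPS from plain HTTP.
import socket
import ssl

def get_certificate(hostname: str, port: int = 443) -> dict:
    context = ssl.create_default_context()  # verifies the chain against system CAs
    with socket.create_connection((hostname, port), timeout=5) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            return tls.getpeercert()

cert = get_certificate("www.google.com")
print("Issued to:", cert["subject"])
print("Expires:  ", cert["notAfter"])
```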

And here is Google’s 2014 announcement:

“We’re starting to use HTTPS as a ranking signal. For now, it’s only a very lightweight signal, affecting fewer than 1% of global queries, and carrying less weight than other signals, such as high-quality content.”
But over time, the promise that Google would strengthen the signal “to keep everyone safe on the Web” seems to be coming true…

HTTPS as a ranking signal in 2014

Searchmetrics found little difference between HTTP and HTTPS rankings in the months after the initial Google announcement. Hardly surprising, as it only affected 1% of results. Moz also saw very little initial difference: prior to August 2014, 7% of page-one Google results used the HTTPS protocol, and a week after the update announcement, that number had increased to 8%.

So we all went about our business, some of us implemented, some of us didn’t. No big whoop. It’s not like it’s AMP or anything! Amirite?

SMASH CUT TO:

HTTPS as a ranking signal in 2016

Moz has found that one-third of page one Google results now use HTTPS.

As Dr. Pete points out, the gradual progression of the graph suggests this probably isn't due to specific algorithm changes, which would normally produce sharp jumps and plateaus. Instead it may mean that Google's pro-HTTPS campaign has been working.

“They’ve successfully led search marketers and site owners to believe that HTTPS will be rewarded, and this has drastically sped up the shift.”
Projecting forward, HTTPS results may hit 50% in 16–17 months' time, and Dr. Pete predicts an algorithm change to further bolster HTTPS in about a year.
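
As a back-of-the-envelope check on that projection, here is the arithmetic, assuming page-one HTTPS share keeps climbing at roughly one percentage point per month (the rate is an assumption inferred from the trend, not a figure from the study):

```python
# Rough projection: months until page-one HTTPS share crosses 50 percent,
# assuming a steady gain of about one percentage point per month.
share = 32.5        # percent of page-one results on HTTPS (Moz, late June 2016)
monthly_gain = 1.0  # assumed percentage points per month

months = 0
while share < 50.0:
    share += monthly_gain
    months += 1

print(f"About {months} months to reach 50%")  # ~18 months at this rate
```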

Source:  https://searchenginewatch.com/2016/07/07/https-websites-account-for-30-of-all-google-search-results/


Categorized in Search Engine

Microsoft has been in the operating system game for decades, but now it's making a move in the web browser wars.

Microsoft Edge, the first new browser interface and engine of this decade, comes with every shipping copy of Windows 10. Microsoft says it has "tens of millions of users," but it's barely a blip on most web browser market share reports.

Even so, Microsoft has used the telemetry provided by those millions of users and an exhaustive battery of tests to prove that Microsoft Edge is actually a more battery-efficient browser than Google Chrome, Firefox and Opera.

In a pair of blog posts published on Monday, Microsoft engineers outline how the current Edge browser can save up to 53 percent of your battery life on a Windows 10 system, as compared to other web browsers like Chrome and Firefox. The second, more technical post promises that the next Edge browser, which will ship with the upcoming Windows 10 Anniversary Update, will be even more energy efficient.

Month            Chrome    Internet Explorer    Firefox    Safari    Microsoft Edge    Other
July 2015        27.82%    53.13%               12.03%     5.09%     0.14%             1.79%
August 2015      29.49%    50.15%               11.68%     4.97%     2.03%             1.69%
September 2015   29.86%    49.19%               11.46%     5.08%     2.41%             2.00%
October 2015     31.12%    48.20%               11.28%     5.01%     2.67%             1.72%
November 2015    31.41%    47.21%               12.24%     4.33%     2.90%             1.91%
December 2015    32.33%    46.32%               12.13%     4.49%     2.79%             1.95%
January 2016     35.05%    43.82%               11.42%     4.64%     3.07%             2.00%
February 2016    36.56%    40.85%               11.68%     4.88%     3.94%             2.08%
March 2016       39.09%    39.10%               10.54%     4.87%     4.32%             2.09%
April 2016       41.71%    36.61%               10.06%     4.47%     4.73%             2.42%
May 2016         45.63%    33.71%               8.91%      4.69%     4.99%             2.07%

Jason Weber, Microsoft's director of program management for Microsoft Edge, explained that different web browsers consume energy in very different ways, much like cars don't all consume gas in the same way. Some are more efficient than others. Some run like they're always in the city while others operate in a more efficient highway miles mode.

Weber contends that Chrome is a city driver. "When you’re browsing the web with Chrome, like city miles, it wakes up, sprints to next stop light, stops and then sprints to next stop light. It's one way to get through city, but uses a lot of gas," he said.

According to Weber, Chrome and Firefox are constantly talking to the operating system. He described it as waking up roughly 60 times a second (and sometimes up to 250 times a second). Edge takes advantage of its deeper integration with the operating system to wake less frequently, he claimed.

"Edge never wakes itself up ... We tell the OS that. 'Hey, we have work to do and you [the OS] tell us when it’s most efficient to do that work," said Weber.

For example, when you touch the screen of your Windows 10 touchscreen computer, the hardware wakes up and sends a message to the web browser on screen. If it's Chrome, Chrome then tells Windows 10 it needs to animate the screen to scroll up or down. With Edge, Weber said, you can scroll the web page without waking the browser up; the OS is already there, ready to do the graphical work.

How do you know?

Getting hours more battery life on your laptop because you chose Edge over Chrome sounds amazing. But why should we trust these claims? Testing technology battery life is notoriously difficult. You need multiple test beds with vanilla set-ups, nothing extraneous running in the background that could affect battery consumption, and perfectly repeatable test scripts. Compounding this is the challenge of testing web page battery consumption: every page is different, and most of what consumes power happens in the background.

Weber acknowledged the challenge, but told me Microsoft had figured out a few ways to accurately test Web browser battery consumption. The team combined lab tests, telemetry from millions of Edge users, and a run-down test that it captured on video.

The lab tests were particularly rigorous. They included 200 PCs, a mini in-lab Internet, systems connected to voltage meters, and special computers with power-measurement chips built right onto the motherboard.

The video showed particularly impressive power gains. At one point, the system running Microsoft Edge ran 70 percent longer than the one running Google Chrome.

Next Level

Whether or not you believe Microsoft, the company is already busy ramping up power efficiency for the next big Microsoft Edge release, which will arrive as part of the Windows 10 Anniversary Update later this summer.

According to a technical blog post by Microsoft Edge Program Manager Brandon Heenan, the Anniversary Update will address JavaScript access in background tabs. Instead of allowing JavaScript to continuously run in hidden tabs, it will slow it down to running once per second.

They're also re-architecting how Edge handles animations by removing duplicate frames at the beginning and end of loops.

However, no change may be more welcome than what the team plans to do with Flash. The Anniversary Update will make Flash a separate process, and the system will pause any unnecessary Flash operations. It will also stop Flash if it becomes unstable, without impacting the rest of the browser session.

Source:  http://mashable.com/2016/06/20/microsoft-edge-battery-life/#81O68D2GhOqt 

Categorized in Online Research

The recent Google Performance Summit introduced smart new features coming to both Google Analytics and AdWords. Once again, Google makes the life of marketers and analysts easier by taking over mundane everyday tasks, leaving more time for us to dig into user and performance data.

The key message was that "mobile first" is not simply coming; it's already here! If your product, marketing and data teams are not ready for it, you might be too late to catch up. The right question to ask now is how to invest in mobile, not whether you should invest in mobile. Rethinking your marketing, ads, site and product so that they are more useful on mobile is how you begin.

Partnering with Google for your ads and analytics is smart because it reaches 90% of the internet population. Google is also able to track online behavior that results in offline purchases, all because of your phone: your devices are connected. So if you're online looking for the best pizza places for lunch in your area, view three of them and then eat lunch at one of them, this is all trackable!

In terms of ads, Google’s cool new features include longer titles for text ads (almost twice the current limit). Titles turned out to be very useful for allowing mobile users to decide to click or not to click. For display ads, the ad is split in several parts: image, URL, headline and text.

Google puts it together for you, automatically optimizing the ad for different devices. Google is also allowing you to bid separately for mobile, tablet and laptop traffic. The best feature of all is the reach to similar audiences. For example, if you know that 35-year-old women who wear shoe size seven and watch the TV show Silicon Valley convert 10% better than the rest of your clients, you can now target them through ads. (Prepare for a higher targeting price, though.)

In terms of Google Analytics, the biggest news is Google Data Studio, which is a tremendous help in report-building. Actually, I am cutting this post short as my hands are itching to try it out and share my opinion on it, of course.

Other nice new features include the intelligent voice search option in Google Analytics (premium) accounts, where you can simply ask, "Hey Google, which are my best converting channels in May?" and the reports appear in front of you. The seamless integration between all Google tools, plus the improved collaboration and sharing options, are the way to go!

Hopefully all these new features will help users not only collect more data, but also make use of it, analyze it and make intelligent business decisions based on those findings. To watch the recording of the #GoogleSummit, click here. 

Source:  https://medium.com/power-to-fly/marketers-heres-how-google-s-new-analytics-tool-will-make-your-life-easier-6d126483f22a#.pxwgoid1p

 

Categorized in Search Engine

If you have heard the term 'RankBrain' before, you will know that it is 'the third most important ranking signal for Google', as this tends to be the most commonly cited fact (as opposed to opinion) about this Google update. You may also know there is some debate, but many experts believe RankBrain is likely to increase in influence over the months to come.

This post explores how to optimize your content for RankBrain and helps you to make the most out of this exciting search and business opportunity.

What Do You Need to Know About RankBrain?

In October 2015, Forbes and Bloomberg were among the first sites to announce that machines were "taking over" at Google. As you might expect, the title, while factually correct, led speculation into the realm of 'robots taking over' rather than the real questions: 'How does RankBrain impact my business?' and, more importantly, 'How can I update my content for RankBrain?'

If you are keen to explore machine learning, the 'robots taking over' debate and associated topics like Artificial Intelligence (AI), here are some reads that you may find useful:

Detailed machine learning insight direct from Google – Machine intelligence resource

Robots taking over insight from a post I wrote soon after RankBrain was first released – Good news, machines have taken over at Google
Adding to the Artificial Intelligence (AI) debate and search – How AI is transforming Google

What is RankBrain?

RankBrain is a form of Artificial Intelligence used by Google to help filter and process a large portion of search queries, so the results displayed are relevant to, and reflective of, the search intent. RankBrain uses machine learning and AI, with the ability to predict meaning, and therefore relevancy, to display the best-matching results, even for new and previously unseen search requests.

RankBrain has a core function of effectively answering new search queries and understanding what results should appear for topics with limited, if any, historical data to base relevancy and therefore ranking on.

The remainder of this post practically explores how to make the most of RankBrain and provides actions you can take to optimize your content for search gains.

Understanding Content

RankBrain looks at new and relatively unknown content topics with a goal of understanding them more effectively so it can display the best results. The more your content supports this understanding, the better. Tactics to deploy will depend on several factors. However, typical SEO actions to encourage content understanding include the following:

Clear purpose: Make it easy for search engines and users to identify why the content was created
Depth of information: Cover the topic in full and provide access to external supporting information
Topic focus: Move away from very refined sets of specific keywords (keyword focus), and cover the variations and natural language used to convey the content meaning spanning target audiences
Term frequency–inverse document frequency (TF-IDF): How frequently you use terms within the content and the perceived importance of those terms to that content – this is not keyword stuffing! (See the sketch after this list.)
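
To make the TF-IDF idea concrete, here is a minimal sketch using scikit-learn. The three "documents" are invented placeholder strings, and this illustrates the metric itself, not anything Google has disclosed about how RankBrain weighs terms:

```python
# Minimal TF-IDF illustration with scikit-learn: terms frequent in one
# document but rare across the corpus receive the highest weights.
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [
    "optimize content for rankbrain and search intent",
    "rankbrain uses machine learning to interpret new queries",
    "write content for your audience, not just for search engines",
]

vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(docs)

# Print the top-weighted terms for the first document.
terms = vectorizer.get_feature_names_out()
weights = matrix[0].toarray().ravel()
for term, weight in sorted(zip(terms, weights), key=lambda tw: -tw[1])[:5]:
    print(f"{term}: {weight:.2f}")
```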

Embracing Machine Learning

Content marketers and search engine optimization experts need to start seeing robots/machines as colleagues rather than competition. Data should be the fuel behind the content you create and reflect your audience's wants and needs, as well as the data opportunity that exists. RankBrain learns (remember the AI aspect of this topic) through data (machine learning), and the more your content creation approach is driven by a combination of data and expert insight, the easier it will become to align content output with RankBrain outcomes.

Do not write for RankBrain!

Effectively using data for content insights and ensuring content is easy to understand are core steps in optimizing for RankBrain. Do not start viewing RankBrain as a persona to target. Instead, by taking the right approach you will naturally fulfill the needs of RankBrain.

Refining Your Existing Content

Content should never have just a single chance of succeeding online. When you add content to your website, that is the first step in the process. This enables you to gather new data relating to how that content performed, how people engaged with it, what searches the content was displayed for, whether the content was successful, and associated #WhatDidTheContentDo-type questions.

When you take the next steps to analyze the data, refine the content and continue this process of content improvement, you are applying an element of human/data collaboration into content creation that will help maximize any RankBrain interaction with your website.

The Value of Voice Search

When you create content for your website consider how people will naturally speak about the content – more specifically, how they would search for it using voice commands. RankBrain is in place to help display the best results for search behavior that has limited current data to base decisions on, and vague (less interpretable) user requests—this is where voice search comes in!

People have had less training when it comes to using voice search, which leads to more complex and varied interpretive requirements for search engines. By reflecting more on the audience your content is created for, you can build user targeting and voice query matching into your content.

Don’t Forget About Links

Every aspect of content relevancy, trust, authority and value must be considered for creating any competitive advantage from Google RankBrain. With hundreds of Google ranking factors, many tied directly to content, the application of tried-and-tested content optimization tactics is a necessity. When you consider the importance of search ranking factors, please do not overlook links.

Links will help provide external relevancy and third-party trust to your content, and provide independent complementary signals for RankBrain to include as part of the broader approach to displaying your content effectively.

Keep the following in mind when pointing links to your content:

Domain Authority matters
Quality takes precedence over quantity in the long term
Relevance to the topic is paramount
Think about the audience before the desired link outcome

Final Thoughts on Content for RankBrain
RankBrain gives content writers and marketers an added incentive to get to know their target audience, refine the content being created, and ensure that content has depth of value and a defined purpose for being produced in the first place.

When you consider how to optimize for RankBrain, a good place to start is 'who the content is for' and 'why it matters'. There are lots of tactics you can use to support RankBrain's known and believed information triggers; however, these should be seen as secondary to the content's actual purpose.

I'd love to hear your thoughts and feedback, so please share them with me and tell me what you think about RankBrain and the tactics you have used that have delivered results.

Source:  https://www.searchenginejournal.com/optimize-content-google-rankbrain/163401/

Categorized in Search Engine

You may be using Google Analytics, but are you using it to its full potential? Contributor Khalid Saleh lays out 7 key reports with which every marketer should be familiar.

For marketers, there are few skills more important than a deep understanding of Google Analytics and its conversion measurement capabilities.

After all, this is the tool that tells you whether your efforts are actually translating into results.

Unfortunately, mastering Google Analytics can be challenging, even for experienced marketers. There is far too much data and too few easy-to-follow dashboards to sort it out.

To help you out, I’ve put together a list of seven custom and standard reports you can use right away to get better insight into your marketing performance.

1. Mobile Performance Report

You know this already: Ours is a mobile-first world. The total number of mobile users now exceeds the total number of desktop users…

… and mobile e-commerce is nearly 30 percent of all e-commerce in the US.

In fact, mobile is so important now that Google even penalizes websites that are not mobile-friendly.

For marketers, knowing how their sites perform on smaller screens is vital to staying alive in the SERPs and winning over customers.

The mobile performance report shows you how well your site (not app) is optimized for mobile and where you need to make improvements.

You can even segment the report further to see which mobile devices/browsers customers are using to access your site. This will tell you if your site is performing poorly on some devices.

Accessing this report is easy: Just go to Audience -> Mobile -> Overview.

This will show you how your site does on different platforms:

You can add more dimensions here as you see fit. Take careful note of bounce rate, time on site and page views to see whether your user experience is failing on one or more mobile channels.
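
If you would rather pull the same breakdown programmatically, a sketch along the following lines works against the Google Analytics Reporting API v4. The view ID and key-file path are placeholders you would replace with your own:

```python
# Sketch: sessions, bounce rate and session duration by device category,
# via the Google Analytics Reporting API v4.
from google.oauth2 import service_account
from googleapiclient.discovery import build

VIEW_ID = "XXXXXXXX"  # placeholder: your Analytics view ID

creds = service_account.Credentials.from_service_account_file(
    "analytics-key.json",  # placeholder: your service account key file
    scopes=["https://www.googleapis.com/auth/analytics.readonly"],
)
analytics = build("analyticsreporting", "v4", credentials=creds)

response = analytics.reports().batchGet(body={
    "reportRequests": [{
        "viewId": VIEW_ID,
        "dateRanges": [{"startDate": "30daysAgo", "endDate": "today"}],
        "dimensions": [{"name": "ga:deviceCategory"}],
        "metrics": [
            {"expression": "ga:sessions"},
            {"expression": "ga:bounceRate"},
            {"expression": "ga:avgSessionDuration"},
        ],
    }]
}).execute()

for row in response["reports"][0]["data"]["rows"]:
    print(row["dimensions"][0], row["metrics"][0]["values"])
```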

2. Traffic Acquisition Report

Want to know if people are actually clicking on your ads? That guest post you published earlier — is it generating any traffic to your website? How about your SEO strategy? Is it actually working?

The traffic acquisition report will tell you all this and more. For many marketers, this will be their first step in the reporting process.

This is a standard report, so you can find it by going to Acquisition -> Overview.

This will give you a quick breakdown of your traffic sources.

Particularly insightful here is the "Referrals" tab (Acquisition -> Overview -> All Traffic -> Referrals). This will tell you which external sites are driving traffic to your site.

Clicking on a referring website will show you the exact pages visitors used to enter your site.

3. Content Efficiency Report


Do you generate a lot of content on your website and find that tracking it is getting a little overwhelming?

Avinash Kaushik, author of Web Analytics 2.0 and a Digital Marketing Evangelist at Google, created this report to solve this exact problem.

This report tracks entrances, page views, bounces and goal completions to help you answer questions like:

Which content is engaging your audience the most?
What type of content (images, videos, GIFs, infographics, reviews) performs best with your readers?
Which content converts readers into customers?
Which content is shared most by your users?

You can get a more detailed explanation of the report here. To grab a copy for yourself, check this link (you’ll need to log into Google Analytics first).

4. Keyword Analysis Report

Getting organic traffic from Google is great. Unfortunately, ever since Google started encrypting search data in 2012, your organic traffic keyword report has mostly shown this:

(not provided)

However, you can still gain a ton of insight about your visitors by tracking the performance of unencrypted keywords.

This report, created by Econsultancy, analyzes the most popular (and available) incoming keywords to your site. It shows visitor metrics, conversion rates, goal completions and page load time for each keyword.

Use this data to figure out what keywords are working best for you, how many of them are actually contributing to your goals and what keywords you need to optimize for in the future.

5. New vs. Returning Visitors

Getting a user to come to your site for the first time is great. Getting them to visit again is even better. After all, it is the returning visitors who usually end up becoming readers, followers and customers.

This standard report in Google Analytics will tell you what percentage of your users are coming back to your site.

You can find it by going to Audience -> Behavior -> New vs. Returning in your Analytics account.

Usually, the metrics for new and returning visitors are quite different. Returning visitors tend to stick around longer and have lower bounce rates.

6. Landing Pages Report

Your users will enter your site from all sorts of pages. Some will type in your home page URL directly, some will find a page through search engines, and some others will click on a link shared on your Twitter feed.

This report will tell you which pages visitors are landing on when they first enter your site. Based on data from this report, you can figure out how users are interacting with your site.

For example, if the report shows that some pages have a substantially higher bounce rate than others, you can take steps to make high bounce rate pages more engaging.

Find the report at Behavior -> Site Content -> Landing Pages.

7. Bounce Rate vs. Exit Rate Report

“Bounce Rate” is the percentage of visitors who don’t take any action and leave from the same page they landed on.

“Exit Rate” is the percentage of visitors who leave your site from a particular page; unlike a bounce, those visitors may have viewed several pages before exiting.
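
A toy example makes the distinction concrete. The session paths below are invented; each is the ordered list of pages one visitor viewed:

```python
# Bounce rate = single-page sessions / sessions that entered on the page.
# Exit rate   = sessions that ended on the page / total views of the page.
sessions = [
    ["/pricing"],             # entered and left on /pricing: a bounce (and an exit)
    ["/pricing", "/signup"],  # entered on /pricing, exited on /signup
    ["/home", "/pricing"],    # exited on /pricing, but not a bounce
]

page = "/pricing"
entrances = sum(1 for s in sessions if s[0] == page)  # 2
bounces = sum(1 for s in sessions if s == [page])     # 1
pageviews = sum(s.count(page) for s in sessions)      # 3
exits = sum(1 for s in sessions if s[-1] == page)     # 2

print(f"Bounce rate: {bounces / entrances:.0%}")  # 50%
print(f"Exit rate:   {exits / pageviews:.0%}")    # 67%
```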

This report compares the bounce rate vs. exit rate for different pages on your site.

You can find it by going to Behavior -> Site Content -> All Pages:

Next, select “Bounce Rate” and “% Exit” in the Explorer tab.

This will give you a visual comparison between bounce and exit rate for all your pages. You can drill down further to get this data for each page.

Use this report to find pages with low engagement and detect UX problems on your site. For example, if visitors are exiting a three-page article after reading only the first two pages, there’s probably something that is causing them to leave on the second page (too many ads, bad copy, a distracting link in the sidebar and so on).

Over to you

Google Analytics is an essential analytics tool for any marketer, but making the most of it can be challenging. By using a mixture of pre-created custom reports and standard reports, you can gain valuable insight into your users.

Google Analytics’ Solutions Gallery is particularly useful for someone new to analytics. Here, you can import expert-created reports into your Analytics account to build powerful dashboards quickly. You can also use these reports as guides to help you understand this incredible tool better.

Source:  http://searchengineland.com/7-essential-google-analytics-reports-every-marketer-must-know-250412 

Categorized in Business Research

We do not possess the ability to read the future, and yet we can predict with a high level of certainty that we will see more major cybersecurity incidents in 2016 and 2017.

The world's cybersecurity capability is not advancing in line with the growing vulnerabilities. We are faced with more and more threats each day, and hackers are becoming more sophisticated. Whether an organization invests $1 million or $100 million in its security infrastructure, it will still remain vulnerable. What's worse, there appears to be no end to this disparity.

Emerging security solutions, great as they may be, do not change the overall way of things; the Internet favors the attacker. Amazing entrepreneurs, as well as established companies, are creating solutions that implement better anomaly detection, better network segregation, better user identification and better leakage prevention. However, these are simply stepping stones, without the necessary leap forward that is required for a long-term solution.

At the same time, the cost of securing businesses from cyberattacks is constantly increasing. This is compounded by old technologies not being replaced by new technologies. Instead, new technologies are being added to already crowded security infrastructures. Unless this changes, there may come a day in which it is no longer deemed cost-effective, business-wise, to introduce new services on the Internet.

Incremental security changes will not work. We need disruptive innovation in the world of cybersecurity: a paradigm shift, something that will dramatically change the way things work. We want a solution with a significant positive effect, similar to the one created by the invention of the car, the smartphone or time travel.

I am going to discuss one such solution now: creating a new, much more secure Internet that will dramatically improve cyber resilience and, at the same time, dramatically reduce expenditure on cybersecurity. Welcome to the world of AGNs (Alternative Global Networks). To understand the concept of AGNs, we must go back to 1969.

In the beginning

In 1969, the same year that Neil Armstrong became the first man to step on the moon and the Beatles released Abbey Road, the first packet was transmitted over a small network named the "Advanced Research Projects Agency Network," also known as the ARPANET.

Trust was not something to be concerned about in this small and controlled network. Trust existed in the ARPANET because there was trust in the real world. The different users knew each other and the few connected devices were all controlled by the creators of the network. Risks such as fraud, hacking, malware, denial of service attacks and others were, to say the least, extremely improbable.

As time went by, the ARPANET expanded and became the technical foundation for the Internet as we know it.

So what do we have today? Billions of users, who don’t know each other and certainly do not trust one another, connecting through all sorts of devices (we have no clue what is connected to the Internet) and using the network in any way they deem fit.

Trust has become a challenge.

The Internet

When the ARPANET project began, no one expected that it would become such a huge success. In these essential early stages, it was not designed with security in mind, but rather to ensure connectivity. And yet, in a very short time, the ARPANET grew from a small research network to the huge global network that we all use today.

Many of the modern security challenges that we experience should be attributed to the fact that the Internet is not secured-by-design. It should be agreed that given the opportunity, we would definitely redesign it.

And to make things worse, much worse, the way the Internet was implemented prevents us from upgrading it to a more secure version. Let me explain what I mean when I say that the Internet cannot be upgraded.

We see a lot of innovation on the Internet. We see amazing new applications using new types of innovative protocols, like Voice over IP and video tunneling — things that no one imagined when the Internet started.

Nevertheless, none of those innovative applications are improving the core way the Internet works. We have been using the same problematic TCP/IP stack (more or less) over the past few decades, with zero probability that it will be replaced in the years to come.

Why? To upgrade the Internet, we actually would have to upgrade all the routers, switches and other connected network devices. And that is impossible to achieve because the network devices are mostly embedded systems that are bundled with hardware. They do not have standard interfaces and only the manufacturer controls the software, which means there is no way to do it remotely. We would have to access and upgrade each and every device.

Even with IPv6 we have failed. IPv6 is still not widely implemented, even though the IETF published its RFC in 1998 and everybody agreed about its importance. Google’s statistics show that only about 10 percent of the users who access Google services are doing so while using IPv6.

And much like any other place in which innovation has taken a backseat, we see so many problems with networking technologies today: they are hard to manage, inefficient, unreliable, costly, prone to manipulations and the list goes on.

Billions of new devices will be connected to the Internet in the coming years (according to Gartner). At the same time, as we have discussed, cybersecurity threats will dramatically increase. Therefore, we have an immediate need for a more efficient, secure, trustworthy and innovation-friendly (upgradeable) Internet.

AGNs (next-generation Internet)

Though upgrading the current Internet is an unfeasible task, there might be another way.

Wireless connectivity technologies of all kinds (Wi-Fi, satellites, cellular, etc.) have vastly improved in recent years. And soon they will reach a point where commercial companies, by using a small number of network devices, could implement worldwide networks that will allow Internet access from everywhere, by anyone and at any time.

Two great examples of companies currently working on bringing wireless Internet connectivity to places around the globe that lack traditional access are Google and Facebook: Google with projects like Project Loon, which plans to use high-altitude balloons, and Facebook with initiatives like Internet.org, which proposes the use of solar-powered drones.

Though daring, a worldwide wireless Internet is inevitable. It simply makes more sense than spending trillions on upgrading super-costly physical infrastructures.

And herein lies the opportunity.

A “worldwide wireless Internet access solution” will allow us to implement a new way of networking, instead of using the traditional TCP/IP Stack based network. This network will not necessarily be IP-based, but rather be built upon a new connectivity model — more secure, simpler to manage and more efficient.

Let’s call this non-TCP/IP global network AGN: Alternative Global Network.

Cybersecurity and AGN

AGNs will introduce numerous opportunities (as well as numerous challenges), far too many to discuss here. Hence, I will focus on three disruptive benefits of AGNs that represent a paradigm shift in the world of cybersecurity.

One: No need for new security tools

In the world of cybersecurity as we know it today, every new problem (or family of problems) leads to the creation of a new family of products. New attack vector = new security tools. This is why, while trying to keep up with emerging threats, we continue to buy new security products.

As previously mentioned, those new emerging solutions represent incremental improvements in cybersecurity. They retain the status quo, rarely addressing the underlying problem, and do not create the changes necessary to overcome the threat of hackers. AGNs will radically change our current approach toward cybersecurity, rebalancing the power divide between the Internet as a force of good and those seeking to undermine it.

The AGN architecture design should allow the AGN provider to upgrade the network operating system and protocol stack both quickly and simply. Obviously, this creates new innovative opportunities, and will also have a tremendous effect on cybersecurity. Here are some examples:

A malicious entity seeks to exploit the way an AGN protocol works in order to facilitate a denial of service attack (much like what we see today). In that case, the moment the first attack has occurred and been analyzed, the AGN provider can update the entire network in a matter of seconds, to prevent the same attack scenario from recurring. This removes the need for every organization to buy a new cycle of products, saving billions on cybersecurity expenses worldwide.

Someone finds a bug in a tunneling protocol that enables them to gain access to what was otherwise restricted data. Again, a simple update (network security patch) and it is fixed.
A new secure GPS-aware packet transportation protocol is needed to support autonomous cars and drones. No problem, come back tomorrow and it will be ready.

The ability to mitigate security risks and create new network services breaks the paradigm of new security risks = procurement of a new set of security tools. Through this, one of the biggest challenges facing cybersecurity today can be solved.

Two: Network virtualization

AGN benefits can include, among many others, all of the benefits that software-defined networking (SDN) aims to introduce, but on a global scale: cost reduction, software-defined packet forwarding, central management and many others. If you are not familiar with SDN, I urge you to learn more about the concept.

One of the most important benefits of SDN, which will also become one of the most important benefits of an AGN, is what is known as simplified virtual management. Though virtual management is already implemented in some organizations (through SDNs), in a global network its benefits are leveraged and ultimately augmented.

Virtualization in networking will have a similar effect to the one virtualization has in computing, i.e. completely revolutionizing the paradigm of the existing coupling between hardware and software.

Virtualization means the ability to simulate a hardware platform, such as network devices, in software. All of the device’s functionality is simulated by the software, with the ability to operate like a hardware-device solution would.

With network virtualization, any network architecture can be defined for any given set of devices, while completely ignoring the physical aspects of how those devices actually connect to the network. For example, your "home" network could contain your computer, laptop, mobile phone, car and all of your family members' devices, regardless of where they are in the world and without the need to implement any kind of VPN solution.

Because the allocation of a device to a network is determined by soft switches (application-based switches), you can sit at the other side of the world and still be connected seamlessly to your home network. This is possible because the network architecture is defined by software rather than physical hardware (as opposed to today, where connections to your home network are only possible if you are connected to your home router).

You might be able to define any type of network architecture just by drawing and setting it up on a graphical dashboard. Alternatively, you might be able to combine any type of security solution in your network by using simple drag-and-drop gestures. Those tools can include firewalls, IDSs, IPSs, network recording, Anti-DDoS, etc., all of which are virtual appliances.

The virtualization of networking will also simplify implementing security tools. If a CISO suspects that someone is already inside his network and wants to deploy a new network inspection solution for a short time, he will just have to add it to the dashboard and, with the click of a button, make all the traffic in the network flow through the new device. No need to define complex routing settings. No need to change VLAN ACLs or firewall rules. Those of us who have faced these problems with traditional networks will really appreciate the change.

But for this to fully work, we also will have to change the way we think about networks. No more LANs and WANs. Anyone who wants to benefit from the network virtualization features will have to live by the principle of “every device is connected directly to the AGN” and the AGN will define logical separation to networks.

Three: Identified by default

The source of many problems we experience with the Internet today can be attributed to the fact that we are trying to supply services that require user identification on a network in which users are anonymous by default.

The same network is being used for e-banking services and drug purchasing, viewing medical results and child pornography, social networking and promoting terrorism.

The AGN provider will be able to implement an identified-by-default network. In this solution, the AGN will authenticate users whenever they are starting to use the network and be able to supply this identity as a service to any application that requires it. In that case, a user might even be able to access his bank without the need to type in a username or password.

The federated identity approach is already being serviced by companies such as Facebook and Google. Federated identity means that the user’s single identity is being used by different identity management systems.

But not only will users be identified, the hardware devices, or rather the network interfaces, can also be controlled to improve security and trust in the network.

How can that be achieved?

To connect to an AGN, one must buy a new type of Network Interface Controller (NIC) that supports the AGN protocol stack (obviously, current TCP/IP NICs will not work with AGNs). A wise design for such a NIC would include remotely programmable/upgradeable firmware (to support the AGN provider's ability to upgrade the AGN quickly and remotely). The NIC will also hold a unique private key (NICPK). This key will facilitate tunneling between devices, as well as functioning as a type of license to use the AGN.

Based on those NICPKs, stored in all the NICs connected to the AGN, the AGN provider will have the ability to create a kind of Network Access Prevention (NAP) solution that will prevent any unidentified or unauthorized NIC from communicating within the AGN. Also, device-to-network allocations will be determined based on each device's NICPK. For example, a CIO might define a whitelist of NICPKs that are allowed to access internal resources, as sketched below.
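
Since AGNs are hypothetical, any code can only be illustrative, but a challenge-response admission check in the spirit of the NICPK idea might look like the following sketch, with HMAC standing in for the public-key machinery a real design would use:

```python
# Illustrative only: admit a device to the network if it can prove knowledge
# of the key registered for its NIC, without ever sending the key itself.
import hashlib
import hmac
import secrets

REGISTERED_NICS = {"nic-001": b"device-secret-key"}  # the provider's whitelist

def admit_device(nic_id: str, respond) -> bool:
    key = REGISTERED_NICS.get(nic_id)
    if key is None:
        return False  # unknown or de-listed NIC: blocked outright
    challenge = secrets.token_bytes(16)
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, respond(challenge))

# The device side answers the challenge with its own copy of the key.
device_key = b"device-secret-key"
print(admit_device("nic-001", lambda c: hmac.new(device_key, c, hashlib.sha256).digest()))  # True
print(admit_device("nic-999", lambda c: b""))  # False: not on the whitelist
```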

And probably the most important feature of using NICPKs is increasing users’ accountability. In the Internet, as we know it today, it is very hard to exercise accountability. Hackers and other malicious entities are getting away with almost anything. The AGN provider will change this, and monitor activities across the entire network. The provider can identify any activity that is not aligned with the network code of conduct and exercise the appropriate sanctions on the user and the device.

For example, if a user created a phishing attack, he will be banned from the AGN network (his account will be disabled and his NICPK will be removed from the whitelist of allowed devices). If a user used torrents to download movies illegally, he will be banned from accessing the AGN for a week. If somebody instigated a DDoS attack using many zombie computers (infected computers that are being remotely controlled by a hacker without the users’ knowledge and consent), the AGN provider will prevent those computers from accessing the network until the virus is removed.

Another feature of an identified-by-default network is the ability of the AGN provider to control which protocols and which websites are allowed. This gives the AGN provider the freedom to decide whether torrents will be allowed, and whether people are allowed to use TOR-like services. One might think that by creating protocol encapsulation, users can override the AGN provider restrictions, and eventually create things like an AGN-based darknet.

But this is not as easy as it might sound, for two major reasons: (A) centralized network management allows relatively easy deep protocol inspection, and (B) the moment the AGN provider learns about this new service, he will be able to completely eliminate it in a very short space of time, thus not allowing any unauthorized services enough time to grow.

Moving to an identified-by-design network with a centralized control and high level of accountability is a paradigm shift from the uncontrolled and decentralized Internet that we have today.

What will happen to the “old” Internet?

We can expect AGN providers to create native services that can only be accessed by the AGN users, and AGNs might eventually even completely replace the old TCP/IP-based Internet. Nevertheless, in the meantime, it is obvious that no one will use AGNs unless access to the servers and services on the “Internet 1.0” will be enabled and seamless.

For that to happen, the AGN provider will have to implement a secure gateway. This gateway will be in charge of protocol translation (by stripping and reconstructing, or by encapsulation) and safe passage. Creating an AGN <-> TCP/IP (or Internet 2.0 to Internet 1.0) gateway while retaining a high level of security in the AGN is one of the biggest challenges AGN providers will have to endure to create an alternative Internet.

Conclusion

It is becoming harder and harder to secure digital assets. We need disruptive solutions that will create a shift in the balance of things — providing a vital lead over malicious factors. Not only can AGNs do that, but they can also completely alter our approach toward cybersecurity.

Some might be concerned about the loss of privacy in an AGN world, and they would be right to be worried. An AGN provider will have infinite power over its users. But the fact that it can, doesn't necessarily mean that it will.

Privacy and security are often opposing forces, and balancing them is more an art than a science. Sadly, the same goes for privacy and monetization. Nevertheless, if designed right, AGNs can have a real, positive impact on the world of technology, while making users feel comfortable and secure.

Implementation, however, will require a very responsible and privacy-aware AGN provider — one that will not misuse their power. Finding a balance between security and privacy, between centralized control and open network, between monetization and fair use, are all challenges that we will have to face on the way to creating a secure AGN.

Source: http://techcrunch.com/2016/03/13/building-a-brand-new-internet/

Categorized in Science & Tech

It has been two years since the Court of Justice of the European Union established the “Right to be forgotten” (RTBF). Reputation VIP subsequently launched Forget.me as one way for consumers in Europe to submit RTBF requests to Bing and Google.

The company has periodically used consumer submissions through the site (130,000 URLs) to compile and publish aggregate data on RTBF trends. A new report looks at two years’ worth of cumulative data on the nature, geographic location and success rates of RTBF requests.

The top three countries from which RTBF requests originate are Germany, the UK and France. In fact, more than half of all requests have come from Germany and the UK.

Google refuses roughly 70 percent to 75 percent of requests, according to the data. The chart below reflects the most common categories or justifications for URL removal requests, on the left. On the right are the reasons that Google typically denies RTBF requests.

Google most frequently denies removal requests that concern professional activity. Following that, Google often denies requests where the individual involved is the source of the content sought to be removed.

[Chart: URL removal request categories (left) and Google's reasons for denying requests (right), via Reputation VIP]

The following list shows the breakdown of URLs submitted by site category. Accordingly, Europeans request more link removals from social sites than any other category. That’s followed by directories, blogs and so on:

Social networks/communities
Directories/Content aggregators
Blogs
Press sites
Wikipedia
Others (real estate, e-commerce, adverts, events, etc.)
By comparison, the links that are actually removed are more often from directories (not clearly defined here) than other site categories. Social site link removals are granted much less often than they’re requested.

[Chart: URL removals granted by site category, via Reputation VIP]

Reputation VIP also notes that Google's processing time has improved in the two years since RTBF was announced, falling from 49 days per request to 20 days or less.


Source:  http://searchengineland.com/report-2-years-75-percent-right-forgotten-asks-denied-google-249424

Categorized in Search Engine

Today the use of the internet has increased to the extent that every individual and company employs it to further their business goals. The internet is used not only for research but also for online trading of goods and services, a clear indication of how heavily companies depend on it for their revenue. With this increased reliance, companies need to gain a better understanding of the activity on their websites. For this purpose, a tool such as web analytics becomes important.

Web analytics is used to optimize the use of the internet and to gain insight into what customers do when they visit a website. It keeps track of the number of visitors and the amount of time they spend on the website, while identifying new and returning visitors for marketing purposes. This becomes useful when developing business strategy, especially for businesses that increasingly rely on the web for their revenue.

While web analytics presents a wide variety of possibilities and statistics, much of it might be irrelevant. Hence, one of the first things you should do is identify your key stakeholders and what they are interested in. Recognizing the goals of the board of directors and the company will save you the time and trouble of organizing irrelevant data. Also identify what a valued customer looks like: is it the customer who visits most often, or the visitor who stays longer? You should try to maximize the visitor's experience by understanding what they seek when they visit your website.

Web Analytic Basics

After outlining the principal goals, identify the critical metrics needed. What are the signals that indicate the value of a customer, and where do the gaps exist? How can we encourage consumers to sign up or make a purchase? After such metrics are identified, your task becomes relatively simple. You don't need to go through vast amounts of data and ratios. Instead, you can do a targeted analysis of the metrics you have singled out and prepare a presentation or report on the relevant statistics.

Understanding the basics of Web Analytics is the first task to utilizing it optimally. It may seem like a daunting task initially, but in reality there are just a few simple steps you need to follow.

You can start by taking a look at the basic figures given in the summary, which is available in any analytics tool and includes the number of visits, bounce rate, average time on site and so on. You should understand what they represent. See how they compare with recent trends, i.e. how the number of visitors, average number of pages visited and other statistics compare to the last few weeks or the last month.

You should also understand where the traffic is generated from. Apart from direct traffic, i.e. the people who visit the website by typing in the URL, there are two major external sources that can guide traffic to your website: referring URLs and search engines. Take a look at the figures to see how much is contributed by direct traffic (the people who know your website well enough to know its URL) and how much by the external sources (the referring URLs and search engines).

Again, the key is to look for trends and see which traffic-generating source has grown the most, and which is lagging behind. You can try to identify the keywords that generate the most traffic, or the keywords that direct traffic from search engines.
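
If you want to see this breakdown from raw data, a minimal sketch like the following classifies each visit as direct, search or referral from its referrer URL. The engine list and visit data are illustrative:

```python
# Classify visits by traffic source; an empty referrer counts as direct.
from urllib.parse import urlparse

SEARCH_ENGINES = {"www.google.com", "www.bing.com", "search.yahoo.com"}

def classify(referrer: str) -> str:
    if not referrer:
        return "direct"
    host = urlparse(referrer).netloc
    return "search" if host in SEARCH_ENGINES else "referral"

visits = [
    "",
    "https://www.google.com/search?q=web+analytics",
    "https://example-blog.com/some-post",
]
for referrer in visits:
    print(classify(referrer))  # direct, search, referral
```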

Since the homepage is not always the first page a visitor encounters on your website, it is important to look at the individual bounce rates for each entry point. A visitor may click on a link to your website and land deep inside it, instead of visiting the homepage first. You should identify the pages that are the top entry points and analyse which of them are engaging enough to make a visitor want to browse more. Keep an eye out for pages with a high bounce rate, since those pages are not convincing visitors to stay; this could be because they are not fulfilling consumer expectations.

After following these steps, you should have narrowed down the focus areas (pages and keywords) to concentrate on when tweaking your website to generate more activity and attract new and returning visitors. Web analytics is best employed for the constant improvement of the customer experience on your website. However, with the range of web analytics tools available, choosing the best one might be a daunting task.

Types of Web Analytics

According to user reviews, Google Analytics seems to be the popular choice. It offers sophisticated data integration free of cost and hence is a great option for starters. Yahoo Web Analytics follows closely; a bit of an upgrade from Google Analytics, it gives a little more insight into consumer behavior and demographics. Clicky is also undoubtedly one of the best, with an affordable price and more current reporting than Google and Yahoo: Clicky lets you view what users are doing on your website right now. Though many more options exist, the choice of the best tool depends on your requirements and budget.

Although web analytics is a breakthrough in conducting business online, using the right criteria, data and tools is imperative to employing it successfully. Effective use of web analytics requires identifying the stakeholders, identifying the valued customers, and thereby identifying the relevant metrics. We should also keep our specific needs and budget in mind when choosing the best available tool.

Summary:

To use web analytics well, we should identify the stakeholders and the valued customers, and from these identify the relevant metrics. And while choosing the best available tool, we should keep our specific needs and budget in mind.


Categorized in Science & Tech

Quality content is a hard sell. Sure, people get that content is important — but getting people to invest the time and resources needed to make content great isn’t easy.

It’s not enough to tell decision makers you need quality content. In order to make the case for it, you have to demonstrate success and failure. Selling content strategy is a continuous process. You must show how content quality impacts business goals and user needs.

This is a tall order. As content strategist Melissa Rach says on the value of content: “Most people understand that content has value. Big value. They just can’t prove or measure the ROI [return on investment]. And, therefore, they have no concept of how much content is worth.”

So, how do we determine if content is good or bad? How do you know if it’s working as you’d hoped?
Content governance is not possible without content measurement. You can’t define content and resource needs without understanding the value and effectiveness of your content.

How Do You Measure Content Quality?

Fundamentally, there are two types of content measurement: quantitative and qualitative.

You can think of quantitative and qualitative as what vs. why. Quantitative analysis can tell you what users are doing — how they’re interacting with your content. Qualitative analysis can tell you why they are on your site — what their intent is and whether your content is communicating clearly.

Together, these two forms of analysis help paint a well-rounded picture of content value. It’s no good knowing what is happening if you don’t know why it’s happening. And it’s no good understanding why if you don’t know what got users there in the first place or what they’re doing.

When it comes to web analytics, I’m equally enthusiastic and cautious. Web analytics provides easy access to valuable insights — not just for content governance but also content planning. However, when used poorly, it can confuse and mislead rather than guide and inform.

In order to make good use of web analytics, you need to understand its strengths and weaknesses.


What Web Analytics Can’t Do

1. Provide a complete content measurement solution

It's a common mistake to use web analytics as a default content assessment tool. Remember, it's only one side of the content measurement equation. As content strategist Clare O'Brien says, organizations are overly obsessed with analytics data:

Broadly speaking — and thanks largely to the ubiquity and ease of access to Google Analytics (GA) — businesses have become fixated by traffic volumes, bounces, sources, journeys and subsequent destinations and the like and aren’t looking to learn more.

We have to think bigger when it comes to content assessment. On its own, web analytics can be misleading.

2. Provide accurate data

One of the reasons web analytics is so compelling for data nerds is that numbers appear definitive and actionable. But in reality, no analytics tool provides completely accurate data. Different data collection methods, reporting errors, and users blocking information sharing all compromise accuracy.

(But don’t worry — I’ll soon tell you why this inaccuracy is okay.)


3. Adequately answer why?

As I mentioned, web analytics can help us understand what users are doing and how they interact with our content. However, it can’t answer why they are interacting with our content.

Web analytics can’t adequately replace qualitative analysis or even a single user telling you why they visited your website and why they left.

What Web Analytics Can Do (And Why It’s Great)

1. Quantitatively evaluate web content quality

There are many definitions for web analytics, but the most clear and succinct I’ve found is on Wikipedia:
"Web analytics is the study of online behavior in order to improve it."
Indeed, that is the strength of web analytics. By understanding how people use your website, you’re empowered to discover and assess content problems that lead to positive change.

2. Comparative analysis: measure website trends

Stumped by the notion that web analytics can't provide accurate data? As promised, fear not! The reason this is okay is that the power of web analytics lies in trends, not in individual numbers.

Without context, single metrics are meaningless. Knowing that you received 8,000 admissions website pageviews last month isn’t as important as knowing that those 8,000 pageviews are a 25 percent increase from the previous year. That’s progress.
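
The arithmetic behind that framing is trivial but worth spelling out; the prior-year figure below is simply the one implied by a 25 percent increase:

```python
# The same absolute number reads very differently against a baseline.
previous_year = 6_400  # implied prior-year pageviews
this_year = 8_000
change = (this_year - previous_year) / previous_year
print(f"{change:.0%} year-over-year")  # 25%
```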

3. Challenge and validate assumptions

We make assumptions every day about how people use our website and what information is most valuable. I’m unable to count the number of website redesigns I’ve witnessed that were guided by assumptions regarding content needs and user goals.

While some of these questions are best answered through a comprehensive content analysis, web analytics can help validate or disprove those costly assumptions.

4. Demonstrate how your website meets established business goals and users’ needs

As important as qualitative content analysis is, these findings rarely make the case for quality content on their own. People need concrete data to assess value.

It’s not enough to simply say that Sally doesn’t want to fill out your two-page inquiry form. It’s more effective to show that the inquiry form has an 80 percent abandonment rate. Gut instincts are good, but numbers are better.

5. Enable stakeholders and content owners to measure the success of their own content

As we know, content governance in higher ed is not a one-person job. It involves numerous departments, content owners and other stakeholders who are charged with making decisions about content. Unfortunately, most of these content stakeholders are not content experts or skilled at assessing content performance.

With planning, web analytics can provide content stakeholders with relevant web metrics to evaluate the success of their content. This is huge. I also find that the more people are aware of how their content is being used, the more likely they are to care about maintaining it. Win, win!

What Is Next?

This post kicks off a series of posts on web analytics and content assessment. I’d like to discuss how we can be smart about our use of web analytics and our approach to governance and measurement.

If there are analytics topics you’d like to see covered as part of this content measurement series, let me know. I’m taking requests!
Update 11/8/12: Check out the second post in this series on web analytics and content assessment, A Web Analytics Framework for Content Analysis. 

Source: http://meetcontent.com/blog/web-analytics-what-is-it-good-for/

Categorized in Online Research