While GPS tagged photos are handy for always knowing where you took a photo, location data embedded in photos does have unsettling privacy and security implications. Should you be worried about the risk of people tracking you down via photos you post online?

Dear How-To Geek,

You guys need to help me out. My mom forwarded me this news clip which (I presume) another one of her friends with equally over-protective grandmotherly traits forwarded to her. Essentially it’s a clip from an NBC news segment highlighting how easy it is to extract the location from a photo. My mom is freaking out insisting that I’m putting my kids at risk because I put photos of them on Facebook and some abductor is going to come climb in their window.

Is this news clip just scare mongering to get people to watch the 10 o’clock news, or is it something I actually need to be worried about? I’d really like to calm my mom down (and make sure I’m not actually posting my personal data like that all over the web).

Sincerely,

Sorta Paranoid Now

Before we delve into the technical side of your issue, we feel compelled to address the social side. Yes, everyone is worried that something bad is going to happen to their kids (or grandkids), but realistically speaking, even if every photo we all posted online had our full home address printed right on the front like a watermark, the probability of anything bad happening to any of us (including our kids) would still be nearly zero. The world just isn’t full of the hordes of awful people we frequently allow ourselves to believe it is.

Even though the news does a good job making us feel like we live in a terrifying world filled with kid snatchers and stalkers, the actual crime stats tell a different story. Violent crime has been falling in the United States for decades, and of the 800,000 or so missing children reported every year in the U.S., the vast majority are either teenage runaways or children taken by parents engaged in custody battles; only around 100 cases fit the stereotypical stranger-snatches-child scenario.

That means stranger abductions account for only about 0.0125% of all under-18 missing-persons cases in the U.S. and, based on Census data indicating approximately 74 million people aged birth to 18 in the U.S., they affect roughly 0.000135% of children. Yet no news producer has ever boosted their evening news ratings by leading with “Tonight at 10, we’ll talk about how your child being abducted by a stranger is about as likely as them getting struck by lightning!”

Now, while we hope you take the above information to heart, we still understand that it’s good security practice not to put our personal information all over the web and to control who has access to the information we share. With the social side addressed, let’s look at the technical side of things and how you can control the flow of information.

Where Is The Location Data Stored?

Photos have EXIF (Exchangeable Image File Format) data. EXIF data is simply a standardized set of non-visual metadata attached to photos; in analog terms, think of it like the blank back of a photograph where you can write down information about the photo such as the date, time, what camera you took the photo with, and so on.

This data is, 99% of the time, extremely handy stuff. Thanks to the EXIF data, your photo organizer app (like Picasa or Lightroom) can tell you useful information about your photos such as shutter speed, focal length, whether or not the flash fired, etc. This information can be enormously useful if you’re learning photography and want to review what settings you used when taking specific photos.

It’s also the same data that allows for neat tricks like searching Flickr based on which camera took the photo and seeing which models are the most popular. Professional photographers love EXIF data because it makes managing large photo collections significantly easier.

Some cameras and smartphones, but not all, can embed location data inside the EXIF. This is the 1% of the time where some people find the whole embedded EXIF data thing to be problematic. Sure it’s fun if you’re a professional photographer or serious hobbyist and you want to actively geotag your photos to appear on something like Flickr’s world map, but for most people the idea that the exact location (within 30 feet or so) where their photos were taken is linked to the photo is a little unsettling.

Here is where it pays to be aware of the capabilities of the equipment you’re shooting your photos with, and to use tools to verify that your equipment is actually doing what it says it’s doing.

How Do I Disable Geotagging?

The first step is to determine whether or not the camera you’re shooting with even embeds location data. Most stand-alone digital cameras, even expensive DSLRs, do not. GPS tagging is still a new enough and novel enough technology that the cameras that feature it advertise it heavily. Nikon, for example, didn’t introduce a DSLR with built-in GPS tagging until October of 2013. DSLRs with geotagging remain so rare that most professionals who want it simply buy a small add-on device for their camera to provide it. GPS tagging is slightly more common in point-and-shoot cameras but still fairly rare. We recommend looking up the specific camera model you own and confirming whether it has GPS tagging and, if so, how to disable it.

Smartphones are, however, a completely different story. One of the big selling points of modern smartphones is the built-in GPS. That’s how your phone can give you accurate directions, tell you there is a Starbucks around the corner, and otherwise provide location-aware services. As such, it’s very common for photos taken with a smartphone to have embedded GPS data, because phones all ship with GPS chips right in them. But just because the phone has a GPS chip doesn’t mean you have to allow it to tag your photos.

If you’re sporting an iOS device, it’s easy not only to turn off geotagging but to limit which applications can access location data on an app-by-app basis.

In iOS 7, navigate to Settings -> Privacy -> Location Services. There you’ll find a general Location Services toggle (which we recommend leaving on, as so many features of the iPhone/iPad rely on location), and then below it, as seen in the screenshot above, individual toggles for individual apps. If you toggle “Camera” off, then the camera will no longer have access to the location data and won’t embed it in the EXIF data of the photos.

For Android, there are two ways to approach the issue. You can go into the camera app itself and disable geotagging. The exact route to the setting varies based on the version of Android and the camera you have, but it’s typically (from within the camera app) Settings/Menu -> Location Icon (tap the icon to toggle the location services on or off):

The alternative method is similar to disabling Location Services on iOS: go into your phone’s general Settings -> Location Access and turn off “Access to my location”. Unfortunately, unlike iOS, Android makes this an all-or-nothing setting. Given how useful GPS data is for other applications (like Google Maps), we’d recommend sticking with toggling geotagging from within the camera app.

How Can I Confirm The Photos Aren’t Geotagged?

It’s all well and good to adjust the settings in your camera or phone, but how can you be sure that your photos are actually free of GPS/location data? Smart geeks trust but verify. The easiest way to check without installing any special software is to simply check the properties of the photo on your computer. We took two photos, one with geotagging turned on and one with it turned off, to demonstrate.

Here is what the geotagged photo looks like when the file properties are examined in Windows:

Here’s a photo taken moments later with the same camera, with geotagging toggled off:

The entire GPS data chunk is missing; the EXIF report jumps right from advanced camera data to basic file information.

Most photo organizers, such as Windows Live Photo Gallery, Picasa, and Lightroom, and even lightweight apps like IrfanView (with a free plugin), will read EXIF metadata.
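If you’d rather script the check than click through file properties one photo at a time, a few lines of Python will do it. The sketch below is illustrative only: it assumes the Pillow imaging library is installed and that your photos sit in a folder called "to_upload" (both my assumptions, not anything mentioned above). It simply reports which JPEGs carry a GPS block in their EXIF data.

from pathlib import Path
from PIL import Image  # pip install Pillow

GPS_IFD_TAG = 34853  # standard EXIF tag ID for the GPSInfo block

def has_gps_data(photo_path: Path) -> bool:
    """Return True if the photo's EXIF metadata contains a GPS block."""
    with Image.open(photo_path) as img:
        return GPS_IFD_TAG in img.getexif()

for photo in sorted(Path("to_upload").glob("*.jpg")):
    status = "geotagged" if has_gps_data(photo) else "clean"
    print(f"{photo.name}: {status}")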

How Can I Remove Location Data?

If you’ve successfully turned off geotagging for future photos, you still have (assuming geotagging was previously enabled for your camera) all the old ones to deal with. If you plan on uploading or sharing older geotagged photos, it’s wise to strip the information out of them before sharing them.

You may have noticed, in the previous section, that the file property box in Windows has a little “Remove Properties and Personal Information” link at the bottom of the interface. If you’re planning on uploading photos, you can highlight all the photos you intend to upload, right click, select Properties, and then bulk strip the data using that “Remove Properties” link in the detailed file view.

You’ll be prompted with the following window:

Here you can opt to completely strip the files of their EXIF data; the first option will make a copy of the files with all the EXIF data removed. Alternatively, you can keep the original files and selectively remove the metadata (this option permanently removes the selected data from the files, with no backup copy). If you want to take advantage of EXIF data reading in an application or online service, but you don’t want to share your location, you can select this option and strip out only the GPS data.

Unfortunately, there is no built-in, easy EXIF data stripper in OS X or Linux. That said, ExifTool is a free cross-platform tool for Windows, OS X, and Linux that can batch process photos and modify or remove their EXIF data.
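If you prefer to script the cleanup, here is a minimal sketch in Python using the Pillow library (again my own assumption, not a tool covered above). It writes EXIF-free copies of every JPEG in a hypothetical "to_upload" folder; note that re-saving through Pillow re-encodes the image, whereas ExifTool edits the metadata without touching the pixels.

from pathlib import Path
from PIL import Image  # pip install Pillow

SRC = Path("to_upload")          # hypothetical folder of originals
DST = Path("to_upload_clean")    # EXIF-stripped copies go here
DST.mkdir(exist_ok=True)

for photo in SRC.glob("*.jpg"):
    with Image.open(photo) as img:
        # Saving without passing the original EXIF bytes drops the metadata,
        # including any GPS block. This re-encodes the JPEG, so keep originals.
        img.save(DST / photo.name, quality=95)

print(f"Wrote EXIF-stripped copies to {DST}/")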

If all your geotagged photos are on your mobile device and you don’t want to put them all on your computer to work with them, there’s an additional option. PixelGarde is a free application available for both Windows and OS X as well as Android and iOS devices. Using the application it’s easy to strip EXIF data in bulk right from your device.


Ultimately, while the actual risk of harm befalling you or your family as a result of EXIF data is pretty small (especially if you’re only posting photos to social networks where you’re communicating with friends and family), it certainly doesn’t hurt to strip the data. It’s easy to turn the feature off in your camera or phone, and it’s easy to remove the data after the fact; unless you’re a photographer who needs or wants to geotag photos for precision logging and display, most of us are content to rely on our memories to recall that the photos were, in fact, taken in our own backyard.

Have a pressing tech question? Shoot us an email and we’ll do our best to answer it.

Source: This article was published howtogeek.com By Jason Fitzpatrick

Written with lawyers in mind, this book provides everything attorneys need to know about conducting online research.

A few weeks ago, I received a review copy of “The Cybersleuth’s Guide to the Internet,” written by Carole A. Levitt and Mark E. Rosch. This book, which was recently updated and is now in its 14th edition, was written to guide lawyers through the process of using the internet to conduct effective and free investigative and legal research.

There’s a wealth of information available online. For busy lawyers, locating a key piece of data can be instrumental to a client’s case. The trick is knowing where it can be found and how to access it. That’s where this book comes in. Written with lawyers in mind, it provides everything attorneys need to know about conducting online research.

At the outset, the authors delve into the ins and outs of the most popular search engines, explaining how to use them to locate information. They spend an entire chapter on Google — rightly so — but also cover Bing and DuckDuckGo.

Two of the most useful chapters cover locating people and public records online. In Chapter 9, the authors provide information on ways to use the internet to locate people and conduct background checks, explaining how and why to research individuals online and noting that many of the free, popular sites they’d recommended in the past are now fee-based.

According to the authors, one site that continues to be free and incredibly useful is Pipl, which is described in full. Also covered are the ins and outs of using Google effectively for this purpose, including search methods that yield relevant results.

In Chapter 10, the authors focus on websites that provide access to free public records and publicly available information, walking through a number of specific sites.

In Chapter 12, you learn this interesting tip: you can use your public library card to gain access to expensive pay databases for free via your library’s online portal. To do so, you’ll need a library card (and possibly a PIN), and will then need to locate the databases on your library’s website. Databases available at some libraries include the full text of The Wall Street Journal and The New York Times, Gale’s Business Directory (which provides background information, broker reports, and more), and ReferenceUSA or AtoZDatabases, which include addresses and phone numbers for millions of people and businesses. In this chapter, the authors walk you through how to use many of these databases.

In subsequent chapters you’ll learn how to:

1) use court dockets as investigative tools;
2) locate liabilities, including bankruptcies, UCC filings, judgments, and liens;
3) locate assets;
4) access vital records;
5) navigate telephone and address directory websites;
6) locate criminal records;
7) find experts and verify their credentials; and
8) much more. This book even includes tips on searching social networking sites and locating images online.

Because of the mercurial nature of the internet, online content is constantly changing. So, unfortunately, as soon as this book was published, some of its content was already outdated. But that’s the nature of the beast and is to be expected when you’re covering internet tools.

Fortunately, the authors maintain updates on their website. For example, prior to the publication of the 14th edition, no changes had been made to PACER since 1988. Since the book was published a few months ago, however, a number of significant changes have been made to PACER. Those updates, along with others that are periodically added, can be found on the authors’ site.

For even more tips and tricks from Carole and Mark, make sure to watch the video of this webinar from a few years ago where they share information on using the internet for legal research and investigative purposes.

Source: This article was published abovethelaw.com By Nicole Black

In Google’s earlier days, the search engine relied heavily on text data and backlinks in order to establish rankings through periodic refreshes (known as the Google Dance).

Since those days, Google search has become a sophisticated product with a plethora of algorithms designed to promote content and results that meet a user’s needs.

To a certain extent, a lot of SEO is a numbers game. We focus on:

  • Rankings.
  • Search volumes.
  • Organic traffic levels.
  • Onsite conversions.

That’s because these metrics are what we are typically judged by as SEO professionals. Clients want to rank higher and see their organic traffic increasing and, by association, leads and sales will also improve.

When we choose target keywords, there is a tendency to go after those with the highest search volumes, but far more important than a keyword’s search volume is the intent behind it.

This is a key part of the equation that is often overlooked when content is produced: it’s great that you want to rank for a specific term, but the content not only has to be relevant, it also has to satisfy the user’s intent.

This article will explain not only the different categorizations of search intent, but also how intent relates to the content we produce, and how the search engines deal with intent.

The Science Behind Intent

In 2006, a study conducted by the University of Hong Kong found that, at a primary level, search intent can be segmented into two search goals: a user is either looking specifically for information relating to the keyword(s) they have used, or looking for more general information about a topic.

A further generalization can be made: intent can be split by how specific the searcher is and how exhaustive the searcher is.

Specific users have a narrow search intent and don’t deviate from this, whereas an exhaustive user may have a wider scope around a specific topic or topics.

Search engines are also making strides in understanding search intent; Google’s Hummingbird and Yandex’s Korolyov are just two examples of this.

Google & Search Intent

There have been a lot of studies conducted into understanding the intent behind a query, and this is reflected by the types of results that Google displays.

Google’s Paul Haahr gave a great presentation in 2016 looking at how Google returns results from a ranking engineer’s perspective. The same “highly meets” scale can be found in the Google Search Quality Rating Guidelines.

In the presentation, Haahr explains basic theories on how if a user is searching for a specific store (e.g., Walmart), they are most likely to be looking for their nearest Walmart store, not the brand’s head office in Arkansas.

The Search Quality Rating Guidelines echo this. Section 3 of the guidelines details the “Needs Met Rating Guidelines” and how to use them for content.

The scale ranges from Fully Meets (FullyM) to Fails to Meet (FailsM) and has flags for whether or not the content is porn, foreign language, not loading, or is upsetting/offensive.

Needs Met Slider

The raters are not only critical of the websites displayed in web results but also of the special content result blocks (SCRBs), aka rich snippets, and other search features that appear in addition to the “10 blue links”.

Special Content Result Block

One of the more interesting sections of these guidelines is 13.2.2, titled: Examples of Queries that Cannot Have Fully Meets Results.

Within this section, Google details that “Ambiguous queries without a clear user intent or dominant interpretation” cannot achieve a Fully Meets rating.

The example given is the query [ADA], which could refer to the American Diabetes Association, the American Dental Association, or a programming language devised in 1980. As there is no dominant interpretation for the query, no definitive answer can be given.

Queries with Multiple Meanings

Due to the diversity of language, many queries have more than one meaning – for example, [Apple] can either be a consumer electrical goods brand or a fruit.

Google handles this issue by classifying the query by its interpretation. The interpretation of the query can then be used to define intent. Query interpretations are classified into the following three areas:

Dominant Interpretations

The dominant interpretation is what most users mean when they search a specific query. Google search raters are told explicitly that the dominant interpretation should be clear, even more so after further online research.

Common Interpretations

Any given query can have multiple common interpretations. The example given by Google in their guidelines is [mercury] – which can mean either the planet or the element.

In this instance, Google can’t provide a result that Fully Meets a user’s search intent; instead, it produces results varying in both interpretation and intent (to cover all bases).

Minor Interpretations

A lot of queries will also have less common interpretations, and these can often be locale dependent.

Do – Know – Go

Do-Know-Go is a concept that search queries can be segmented into three categories: Do, Know, and Go. These classifications then, to an extent, determine the type of results that Google delivers to its users.

Do (Transactional Queries)

When a user performs a “do” query, they are looking to achieve a specific action, such as purchasing a specific product or booking a service. These are important to e-commerce websites for example, where a user may be looking for a specific brand or item.

Device action queries are also a form of “do” query, and they are becoming more and more important given how we interact with our smartphones and other technologies.

Ten years ago, Apple launched the first iPhone, which changed our relationship with our handheld devices.

The smartphone meant more than just a phone; it opened up our access to the internet on our terms. Obviously, before the iPhone we had 1G, 2G, and WAP – but it was really 3G, which emerged around 2003, and the birth of widgets and apps that changed our behaviors.

Device Action Queries & Mobile Search

Mobile search surpassed desktop search globally in May 2015 in the majority of verticals. In fact, a recent study indicates that 57 percent of traffic comes from mobile and tablet devices.

Google has also moved with the times – the two mobile-friendly updates and the impending mobile-first index being obvious indicators of this.

Increased internet accessibility also means that we are able to perform searches more frequently based on real-time events.

As a result, Google currently estimates that 15 percent of the queries it handles on a daily basis are new and have never been seen before. This is in part due to the new accessibility the world has and the increasing smartphone and internet penetration rates being seen globally.

According to comScore, mobile is gaining increasing ground not only in how we search but in how we interact with the online sphere. In a number of countries, including the United States, United Kingdom, Brazil, Canada, China, and India, more than 60 percent of our time spent online is through a mobile device.

One key thing to understand about mobile search is that users may not always complete their task on that device.

In my experience, working across a number of verticals, a lot of mobile search queries tend to be focused on research and information, with users moving to desktop or tablet at a later date to complete a purchase.

According to Google’s Search Quality Rating Guidelines:

Because mobile phones can be difficult to use, SCRBs can help mobile phone users accomplish their tasks very quickly, especially for certain Know Simple, Visit-in-Person, and Do queries

Mobile is also a big part of Google Search Quality Guidelines, with the entirety of section two dedicated to it.

Why Voice Search Is Important to Mobile Search

Also in section two of the guidelines is an important sentence, which helps us understand the relationship that Google sees between mobile and voice search.

If you are not familiar with voice commands, device actions, or phone features, please take some time to experiment on a mobile smartphone. For example, you can try some of these voice commands…

Virtual assistants have evolved since the Microsoft Paperclip.

Talking to your phone or small device in the corner of your room is fast becoming the norm.

This evolution has come hand in hand with the increase in smartphone penetration and technologies, as Echo and Google Home devices fill our homes.

Know (Informational Queries)

A “know” query is an informational query, where the user is wanting to learn about a particular subject.

Know queries are closely linked to micro-moments.

In September 2015, Google released a guide to micro-moments, which are happening due to increased smartphone penetration and internet accessibility.

Micro-moments occur when a user needs to satisfy a specific query there and then, and these often carry a time factor, such as checking train times or stock prices.

Because users can now access the internet wherever, whenever, there is the expectation that brands and real-time information are also accessible, wherever, whenever.

Micro-moments are also evolving.

Know queries can vary between simple questions [how old is tom cruise] to much broader and complex queries that don’t always have a simple answer.

Know queries are almost always informational in intent.

Know/informational queries are neither commercial nor transactional in nature. While there may be an aspect of product research, the user is not yet at the transactional stage.

A pure informational query can range from [how long does it take to drive to London] to [gabriel macht imdb]. To a certain extent, these aren’t given the same importance as directly transactional or commercial queries – especially by e-commerce websites – but they do provide user value, which is something Google looks for.

For example, if a user wants to go on holiday, they may start by searching for [winter sun holidays europe] and then narrow down to specific destinations. Users will research the destination further, and if your website is providing them with the information they’re looking for, there is a chance they may inquire with you as well.

Position Zero

Rich snippets and special content result blocks (i.e., featured snippets) have been a mainstay of SEO for a while now, and we know that appearing in an SCRB area can drive huge volumes of traffic to your website.

On the other hand, appearing in position zero can mean that a user won’t click through to your website, meaning you won’t get the traffic and the chance to have them explore the website or count towards ad impressions.

That being said, appearing in these positions is powerful in terms of click-through rate and can be a great opportunity to introduce new users to your brand/website.

Go (Navigational Queries)

“Go” queries are typically brand or known entity queries, where a user is looking to go to a specific website or location.

If a user is specifically searching for Adidas, serving them Puma as a result wouldn’t meet their needs.

Likewise, if your client wants to rank for a competitor’s brand term, you need to make them ask why Google would show their site when the user is clearly looking for the competitor.
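To make the three buckets concrete, here is a deliberately crude heuristic classifier. It is a sketch for illustration only; the cue words are my own assumptions, and this is in no way how Google (or any other search engine) actually determines intent.

import re

# Illustrative cue words only; real intent classification is far more nuanced.
DO_CUES = ("buy", "book", "order", "download", "price", "deal")
GO_CUES = ("login", "sign in", "website", ".com", "near me")
KNOW_CUES = ("how", "what", "why", "when", "who", "guide", "vs")

def classify(query: str) -> str:
    q = query.lower()
    if any(cue in q for cue in DO_CUES):
        return "Do (transactional)"
    if any(cue in q for cue in GO_CUES):
        return "Go (navigational)"
    if any(cue in q for cue in KNOW_CUES):
        return "Know (informational)"
    # A single brand-like token often signals navigational intent.
    if re.fullmatch(r"[a-z0-9]+", q):
        return "Go (navigational)"
    return "Unclassified - needs human review"

for q in ["buy adidas ultraboost", "how old is tom cruise", "facebook login"]:
    print(q, "->", classify(q))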

Defining Intent Is One Thing, User Journeys Another

For a long time, the customer journey has been a staple activity in planning and developing both marketing campaigns and websites.

While mapping out personas and planning how users navigate the website is important, it’s necessary to understand how a user searches and what stage of their own journey they are at.

The word journey often sparks connotations of a straight path and a lot of basic user journeys usually follow the path of landing page > form or homepage > product page > form.

We assume that users know exactly what they want to do, but mobile and voice search have introduced a new dynamic to our daily lives and shape our day-to-day decisions like nothing else.

These micro-moments directly question our understanding of the user journey. Users no longer search in a single manner and because of how Google has developed in recent years, there is no single search results page.

We can determine the stage the user is at through the search results that Google displays and by analyzing proprietary data from Google Search Console, Bing Webmaster Tools, and Yandex Metrica.

The Intent Can Change, Results & Relevancy Can Too

Another important thing to remember is that search intent and the results that Google displays can also change – quickly.

An example of this was the Dyn DDoS attack that happened in October 2016. Unlike other DDoS attacks before it, the press coverage surrounding the Dyn attack was mainstream – the White House even released a statement on it.

Google Trends Screenshot

Prior to the attack, searching for terms like [ddos] or [dns] produced results from companies like Incapsula, Sucuri, and Cloudflare. These results were all technical and not appropriate for the newfound audience discovering and investigating these terms.

What was once a query with a commercial or transactional intent quickly became informational. Within 12 hours of the attack, the search results changed and became news results and blog articles explaining how a DDoS attack works.

This is why it’s important to not only optimize for keywords that drive converting traffic but also those that can provide user value and topical relevancy to the domain.

Source: This article was published searchenginejournal.com By Dan Taylor 

When you have limited time and resources, how do you choose which channel to focus on? Columnist Jordan Kasteler lays out the pros and cons of email marketing and social media to help you decide.

If you’re a busy professional with a digital company, you’ve likely lamented a thousand times over where to focus your limited resources. Email marketing and social media are two marketing tactics with a bundle of buzz, but which will give you the most efficient and effective results?

Email and social media are two completely different beasts and could serve two separate purposes in your overall strategy. To narrow your focus, we have to first get clear on what you’re after.

That said, keep in mind that it is not necessary to choose one or the other; each has its own place and benefits, and they should be used in tandem to expand your business to new audiences and levels of success. But it’s always advantageous to have a core focus and know which modalities bring the most bang for your business buck.

Email marketing is a powerful mainstay

Email marketing is the little engine that could. It’s one of the only evergreen strategies that has worked since the web first landed.

This year, the number of worldwide email users will grow to over 3.7 billion, according to a 2017 report from The Radicati Group. By 2021, that number will climb to over 4.1 billion.

Gmail alone touts over 1 billion of those users.

When we compare this figure to the reach of social media, roughly 2.5 billion users, it is clear that email takes this round. According to Statista, social adoption will only increase to 2.95 billion by 2020 — still a far cry from email’s reach.

It’s this massive potential that has allowed email marketing and campaign management services like GetResponse to thrive and incorporate other powerful features like webinar solutions, custom landing pages and automation elements.

Additionally, one of the most significant benefits that email holds over social is that communications will reach their intended recipients about 90 percent of the time.

Email marketing, however, is not as easy as it seems.

First, the 3.7 billion email users are not all accessible the way they are on social media.

Moreover, email lists need to be carefully refined to only reach the most interested and qualified prospects; that means you can’t just buy a list of digital addresses and expect your email blasts to turn a profit. Email lists must primarily include people who actually want to hear from you.

Another downfall is that emails have to jump through a variety of hoops before safely landing in a person’s inbox. While most people will end up receiving your communications, poor email designs and content can cause messages to be labeled as spam. That takes all your efforts and tosses them into the digital trash bin. This is quite the double whammy when you consider the challenges associated with building an email list and gaining new subscribers.

For B2C emails, however, one of the biggest challenges is getting to know your audience well enough to tailor communications that will convert; companies need to understand their customers’ habits and tendencies in order to segment them properly and recommend relevant deals or products.

Additionally, B2C brands need to study their open rates to gain insights on the most beneficial times to send communications, so that a consumer is more likely to convert.

And for consumer-facing companies, in particular, email elements such as compelling copy, relevant calls to action, feature placement and the ever-imperative mobile optimization are critical challenges.

Without all these elements in place, the chances of recipients opening an email, clicking a contained link, and making a purchase from the web store are nil.

Social media has marketing superpowers

Social media has become a cultural phenomenon, evolving into a deep-rooted marketing necessity around the time that the Web 2.0 revolution began.

Since then, social media platforms have matured and transformed to become a marketer’s best friend.

All of the major players in the space now support hyper-targeted advertising, entertaining and dynamic content, massive reach capabilities and a variety of advertising channels for marketers to leverage. Most platforms also incorporate social selling opportunities like Facebook’s “shop” sections and Instagram’s shoppable posts.

These kinds of features allow businesses to seamlessly offer consumers products with unparalleled convenience.

Social media also helps propagate brand awareness in a way that has never been possible. Businesses can post blogs, updates, videos and other forms of content that users can then share with their friends, who share with their friends and so on.

This not only keeps consumers educated in real time but also pulls more prospects into marketing funnels by generating more awareness/interest, and even produces the potential for content to go viral.

As a natural byproduct, social media’s uploaded content and discoverability also tend to drive increased levels of traffic to websites, potentially leading to more conversions and higher rankings in the SERPs.

And did I mention that social media is 100 percent free to use? If you aren’t leveraging paid ads, that is.

But as you well know, there are some massive social pain points that must be considered.

Organic reach on social media has been declining at alarming rates over the past several years in accordance with various algorithm updates. Late last year, it was uncovered that publishers saw a 52 percent decrease in organic reach over the course of 2016.

Just three months later, The Wall Street Journal published a piece revealing that Facebook had come clean on miscalculating organic reach in the Page Insights dashboard:

…Facebook found that it had been over counting how many people were exposed to marketers’ organic posts, meaning regular posts that weren’t paid ads, because it was adding up the daily reach over certain periods without accounting for repeat visitors. The corrected metric on average will be about 33 percent lower for the seven-day period and 55 percent lower for the 28-day period…

But even if a brand’s posts are seen by the intended audience, it takes far more nurturing to turn social crowds into customers. This often equates to inflated expenditures relating to social ads, images, tools, content and other educational or sales-related materials.

And since 78 percent of consumers read reviews before buying, your brand’s page reviews had better be stellar if you have any hope of gaining a new customer.

Setting aside social media’s increasingly “pay to play” environment and other challenges, one of the final downsides to social is how businesses manage their social existence.

All too often, brands spread themselves far too thin by trying to participate on Facebook, Twitter, Instagram, Snapchat, Google+, Pinterest, Tumblr, LinkedIn and other popular networks. Without a refined strategy, it’s extremely easy to get lost in the digital noise.

Which channel should you use?

As far as B2C organizations are concerned, email is going to be a more beneficial and prosperous method for driving more sales and cultivating loyalty among consumers. Considering that emails will land in a customer’s inbox more often than not, it’s essential to study your audience and deliver more personalized messages, send communications at the right times and structure emails to allow for the most engagement and conversions possible.

While email is more fruitful for B2C, social still does a lot of heavy lifting in terms of generating awareness, website visits, increased email subscribers and brand loyalty as consumers continue to engage with a company.

If you can only do one, email is your champion. If you can do both, then you have a recipe for continually building, nurturing and converting leads in an exponentially powerful way.

Source: This article was published marketingland.com By Jordan Kasteler

Google has released a new API for publishers which uses machine learning to filter out hurtful comments online. The new tool, called Perspective, can be integrated into any publishing platform which utilizes online comments.

Perspective cross-references comments against a human-generated database of comments that have already been labeled as offensive. When a new comment is posted, Perspective will rate it based on how closely it resembles the hundreds of thousands of abusive comments in Google’s database.

If it has been determined that a comment is abusive, Perspective will notify the commenter in real time, which you can see in the example below:



Comment sections have become so volatile as of late that many publishers have opted to remove them altogether, rather than trying to moderate them. Perspective aims to make it easier to moderate abusive comments before they go live.

Google revealed Perspective is currently being tested with the New York Times’ comment section. As more sites integrate the API with their publishing platform, Perspective will continue to learn and grow its database of toxic comments, making it even more efficient over time.
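For publishers weighing an integration, a request to the API is a simple JSON POST. The sketch below is illustrative: the endpoint, request fields, and TOXICITY attribute follow Google’s publicly documented Comment Analyzer API at the time of writing, and the API key is a placeholder you would obtain from Google.

import json
import urllib.request

API_KEY = "YOUR_API_KEY"  # placeholder; issued via the Google API Console
URL = ("https://commentanalyzer.googleapis.com/v1alpha1/"
       "comments:analyze?key=" + API_KEY)

payload = {
    "comment": {"text": "You are a total idiot."},
    "requestedAttributes": {"TOXICITY": {}},
}

request = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    result = json.load(response)

# summaryScore.value is a 0-1 estimate of how closely the comment resembles
# comments that human raters have labeled as toxic.
score = result["attributeScores"]["TOXICITY"]["summaryScore"]["value"]
print(f"Toxicity score: {score:.2f}")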

Currently, Perspective is only able to flag abusive comments left in English, but that could change over the next year if it ends up catching on with publishers.

Source: This article was published on searchenginejournal.com by Matt Southern

Wellingtonian James Jordan Winstanley set up a "dark net" website so he could buy drugs. (File photo)

A  young Wellington man with "mad computer skills" made a stupid mistake when he set up a "dark net" website to help him buy drugs, a judge has said.

James Jordan Winstanley, 22, also set up a Facebook page that led to the Vic Underworld site, which users then had to be invited to use.

However, Wellington District Court was told on Thursday that there was no evidence he supplied any drugs, and a police search of his home found no drugs.

Judge Bill Hastings said Winstanley did not have a formal diagnosis of autism, but it was clear he had "social deficits", and had in the past used prescribed medication for himself and others.

"You are an intelligent young person with mad computer skills who also has certain social deficits which you have taken significant steps to overcome," the judge said.

"You made a stupid mistake when you set up this website."

The "dark net" is the backroom of the internet, where sites such as online drugs supermarket the Silk Road are hidden behind encryption systems. Sites are considered to be on the dark web if they can be accessed only by using special networks, designed to give visitors some degree of anonymity.

Winstanley's lawyer James Elliott said his client set up Vic Underworld to help him buy drugs, rather than use it as a marketplace. Winstanley knew no other way of getting drugs, and turned to what he did know, which was information technology. 

He posted on the site using fake identities, to make it seem better used than it was.

Facebook took down the initial page, but Winstanley put up two more. The judge said this showed defiance and persistence.  

He was caught as part of Operation Hyperion, in which police and Customs in co-ordination with international agencies targeted the buyers and sellers of illegal drugs.

He pleaded guilty to three charges of offering to supply MDMA, codeine and cannabis between May and October 2015.

Hastings told Winstanley he had come to his "day of reckoning", and sentenced him to four months' community detention and six months' supervision.

Source: This article was published on stuff.co.nz

Angels & demons.

Good & bad.

The full spectrum of the human psyche. Being human.

People do amazing things everyday and carry out heart-warming acts of kindness and love.

There are others who demonstrate atrocities and acts of pure evil.

This week's account is from a gentleman called Donavon. At the time he was 10 years old and lived in Penkhull. The year was 1984.

Don, now 43, was brought up in a devout Catholic household and spent many days reading and learning about the Bible and its many teachings.

"Its was something I loved to do," says Don. "Learning about Christ, and his sacrifices, taught me a lot about life and myself to be honest.

"My father was involved with the local church and we would spend a lot of our free time at the church together – I have really fond memories of growing up."

Living with his father, mother and younger twin sister, Don enjoyed a happy care-free childhood.

Until one day in late 1984 when he was alone in the kitchen.

"It started with a buzzing sound, like someone had left something electrical switched on," says Don.

"It was a noise I hadn't heard before. I looked but couldn't find the source – it seemed to move around the room, from one side to the next.

"Then I noticed a smell, it was like rotten meat - a terrible smell that really unnerved me."

Then early one morning - at about 3am - Don awoke to a noise coming from outside his bedroom door.

"It sounded like an animal was sniffing under the door which was scary as we didn't have a dog or cat.

"I froze in the bed and didn't dare move."

Still the animal could be heard sniffing under the door, as if it were hunting for food... or someone.

"It sounded big and I seem to remember it even grunted at one point," says Don.

Petrified, he began to scream for help. The noise stopped when his father and mother ran into his room.

"That's when I smelt the rotting meat again and the buzzing noise started up – with my confused and terrified parents being witness to it this time as well. All three of us heard it."

The following day Don's parents brought a priest in to the home for a cleansing. It worked and they never experienced anything like it again.

Humans can be pure or evil.

So can spirits.

Author: The Sentinel
Source: http://www.stokesentinel.co.uk/supernatural-staffordshire-the-family-called-in-a-priest-after-a-demon-entered-their-home/story-30257934-detail/story.html


How to avoid falling for email scams

Early one Sunday morning, my editor, Yahoo Finance’s Erin Fuchs, checked her personal email and was surprised to find a message from PayPal (PYPL). The missive said she had recently changed her password, and asked her to call a phone number if that wasn’t the case.

It wasn’t, so Fuchs called. The email had come from what appeared to be a genuine PayPal address and included a link to the PayPal website. However, she became suspicious when the person on the other end of the line asked for her credit card information to “verify her account.”

Phishing email.

It doesn’t matter who you are or what email service you use. If you have an email account, you’ve received some kind of scam, or phishing email, just like my editor.

Most of the time, these emails are relatively easy to spot. Some African prince or other wealthy individual wants to send you money until he can make it to the US. You just need to send your bank account information and Social Security number.

But criminals are quickly changing their tactics, firing off more sophisticated emails in an attempt to trick you into giving away your personal information. According to Gary Davis, chief consumer security evangelist at Intel (INTC) Security, in a recent study more than 19,000 people were asked to look at 10 emails and identify which ones were scams; only 3 percent were able to spot all of the phony messages.

Worse still, some phishing messages contain ransomware, which locks down your entire computer until you pay the culprits a ransom.

Yes, it’s a scary world out there. But there’s hope. If you follow some of these quick tips, you’ll be able to stay one step ahead of the bad guys.

Read the subject line and sender’s address

Phishing emails are designed to sucker as many victims as possible. They cast a wide net by covering topics like banking and package deliveries—two things most people generally receive emails for.

You should be on high alert if you get a message from an unknown sender with a subject line mentioning changes to your bank account—or that you need to pick up a package that can’t be delivered—and you aren’t expecting either of those things. It’s probably a phishing attempt.

Just delete the message and move on with your life.

Hover over links

Okay, so you can’t remember if you changed your bank account info or aren’t sure if you have a package in the mail, so you open the email. That’s cool. As Intel Security’s Gary Davis explains, it’s rare that just opening a message executes any kind of code on your computer.

Phishing emails.

The message, however, tells you to click a link to check out the changes to your account or the status of your package. What do you do? Simple: Hover your mouse over the URL. When you point to a link without clicking, most web browsers and email programs automatically display the web address that link will open. If the email says it’s from your bank or delivery service, but the link points to a different site, don’t click it.
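The same hover check can be automated. The short Python sketch below (standard library only, purely illustrative, with a made-up example URL) walks the links in an HTML email body and flags any whose visible text names one domain while the underlying href points somewhere else.

from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkAuditor(HTMLParser):
    """Collect (visible text, actual href) pairs from an HTML email body."""
    def __init__(self):
        super().__init__()
        self._href = None
        self._text = []
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append(("".join(self._text).strip(), self._href))
            self._href = None

# Hypothetical phishing snippet: the text says PayPal, the link does not.
html_body = '<p>Verify now: <a href="http://paypa1-secure.example/login">www.paypal.com</a></p>'

auditor = LinkAuditor()
auditor.feed(html_body)

for text, href in auditor.links:
    shown = urlparse(text if "://" in text else "http://" + text).hostname
    actual = urlparse(href).hostname
    if shown and actual and shown != actual:
        print(f"Suspicious link: text shows '{shown}' but it points to '{actual}'")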

Urgency is suspect

A good number of phishing emails try to get you to act before you think—by adding a sense of urgency to their messages. An email telling you to log into or verify information for your bank or other account labeled “Final Warning” or “Urgent Notification” should set off warning bells right away.

Kevin Haley, director of product management for Symantec’s (SYMC) Security Response, explains that you should be suspicious if you receive an email with a URL or attachment that is trying to get you to click on something right away.

A scam email ordering you to do something immediately.

Russian agents are widely considered to have used this exact method to break into the Democratic National Committee’s servers via a phishing email. So if you get a message telling you to do something instantly, ignore it. If you think it’s legitimately from your bank, skip the link and go directly to the company’s website.

Hooked on phonics

The easiest way to identify a phishing email is if it’s loaded with grammatical or spelling errors. As Microsoft points out in its phishing email primer, legitimate businesses hire professionals to ensure that communications with customers are mistake-free. Criminals? Not so much. So if you get an email that’s strangely formatted and loaded with enough grammar issues to drive your fifth-grade English teacher insane, delete it.

Spam email with poor grammar.

Patience is a virtue

A lot of people fall victim to phishing emails because they’re simply in a rush. They’re in the middle of cooking dinner and taking care of two toddlers, see an email from their bank and BAM, that’s that. So how do you fix this? Just take a few minutes, breathe, and read your emails carefully. That’s pretty much it.

What to do when you’re hooked

So you’ve clicked a link or downloaded an attachment in a phishing email. You’re done for, right? Not exactly.

Both Davis and Haley suggest that if you realize you’ve been the victim of a phishing scheme and you’re fast enough, you can change your passwords on any affected websites before the criminals get access to your accounts. If you can’t do that, your best bet is to disconnect your computer from the internet and run an antivirus program.

Disconnecting your computer (like turning off WiFi) ensures that any malware you downloaded can’t communicate with its home server and steal your information; meanwhile, the antivirus program takes care of anything on your machine. You should also enable two-factor authentication on your accounts, which requires that you enter both your password and a second string of characters usually sent to your smartphone via text or an app, to keep people from accessing your information. 

If, however, you’ve given your private information to someone via email, well, your best bet is to use a credit-monitoring service to make sure that no one is opening credit-card accounts in your name.

Author: Daniel Howley
Source: https://www.yahoo.com/tech/how-to-avoid-falling-for-email-scams-174835308.html

Internet service providers have warned that using WhatsApp offline can expose subscribers to hacking and malicious viruses.

This was contained in a message by Airtel Nigeria, which advised Nigerians, especially subscribers on its network, to be vigilant in accepting certain messages.

It said on Tuesday in Lagos that there had been messages in circulation claiming that a subscriber could make use of WhatsApp without access to the internet.

“Dear customer, our attention has been drawn to messages notifying customers of the use of WhatsApp without internet.

“Kindly ignore and do not click on those links, as it redirects to cloned applications.

“The links may be used to harvest sensitive information from your device. Be cautious,” Airtel said in a text message to its customers.

The News Agency of Nigeria reports that the message has been circulating since the beginning of 2017, and many may have fallen victim.

The hackers’ message usually reads, “First, you need to update your WhatsApp iOS to the latest WhatsApp version 2.17.1.

“Now, this allows sending the message to any contact in your list without having internet connection.

“This feature was available on Android for more than a year, but iOS users are only getting it now.

“Also, you will be getting an option to send 30 photos or videos at a time if you update your WhatsApp.”

The message has been confirmed to be a hoax and should be disregarded. (NAN)

Source : punchng.com

Industrial companies are planning to commit approximately $907 million annually to their Industrial Internet of Things (IIoT) initiatives, according to a new PwC report launched to coincide with the Global Manufacturing and Industrialisation Summit (GMIS) in Abu Dhabi.

The Chief Information Officer’s (CIO’s) role in defining a company’s strategy has become more important than ever, the report affirmed.

In its latest report, PwC said that managing the transition to the Industrial Internet of Things (IIoT) will be a highly complex task, one that CIOs cannot afford to miss out on. It points to studies showing that by 2020, companies will likely spend $1.7 trillion a year on the combined industrial and consumer Internet of Things (IoT).

That transformation to IIoT is materialising fast: a recent PwC Industry 4.0 Survey found that industrial companies are planning to commit approximately $907 million annually to their IIoT initiatives. Those companies expect $421 billion in cost reductions and $493 billion in increased revenues annually from the implementation of IIoT, with 55 percent expecting a payback within two years.

The sheer size of the Industrial IoT opportunity, which PwC says far outweighs all expectations of the consumer oriented IoT, means that CIOs will have to take centre stage in leading a digital transformation that aligns strategy and technology with the manufacturing environment and the manufactured product.

Dr. Anil Khurana, Partner, Strategy & Innovation at PwC Middle East and the report’s lead author, said: “The IIoT will place huge demands on the CIO. It is indeed an opportunity that few will want to miss. First-mover status is critical to gaining a competitive edge as companies begin moving en masse to reap the benefits of digitization. Our research into the IIoT domain suggests that CIOs take six important steps towards their companies’ future digital transformation, which have been outlined at length in the report. These steps include key elements such as the development of a digital strategy, building capabilities and, eventually, initiating pilot programmes.”

“Supporting the GMIS vision to promote manufacturing and industrial innovation; driving towards sustainable development; and contributing to wealth generation and prosperity, PwC has facilitated connections between enterprises of all sizes that are now embracing the 4th Industrial Revolution, or 4IR, and embracing IIoT. PwC has facilitated the development of the pilot programmes being discussed and presented at GMIS,” Dr. Khurana added.

Commenting on the report, Badr Al-Olama, Chief Executive Officer, Strata Manufacturing, and Head of the Global Manufacturing and Industrialisation Summit Organising Committee, said: “For the manufacturing sector, the Industrial Internet of Things is at the heart of 4IR. As PwC points out in this report, the CIO is the key driver in helping organisations to adopt IIoT, aligning business strategy with technology transformation. Their role is to ‘normalise’ innovation in large, complex organisations, drawing on a new capacity to intelligently connect people, processes and data through devices and sensors. For manufacturers, this creates the prospect of the digital factory, where ‘smart’ manufacturing technologies control energy, productivity and costs through real-time monitoring and the application of data insights. PwC’s report sets out a roadmap for IIoT transformation, prepping the experts – including CIOs from leading global manufacturers – to put together a vision for manufacturing that is based on 4IR technologies,” he explained.

Author : Shayne Heffernan

Source : livetradingnews.com
