Association Admin

To me, deciding on my 'Smartphone of the Year' is a curious challenge. The choice can't simply be 'the best phone', because everyone has slightly different criteria for what makes the best phone. If I were to approach it empirically and pick the phone that fits the majority of people's criteria, I wouldn't have the best phone; I'd have 'the average phone of the year', the one that upsets the fewest people.

For a smartphone to pick up my personal award it needs to say something about itself, about the manufacturer behind it, and it needs to reflect the smartphone industry over the last twelve months.

So, with just a little bit of scene-setting and discussion about the phones I'm placing in third and second place, let's find out my smartphone of 2016.

Third Place: Jolla C, by Jolla

I've known for a long time that the Jolla C would be in the running for the award, because for the middle six months of the year it was the perfect 'proof by negation' of what the smartphone industry required from a smartphone in 2016.

The Jolla C hardware might look a touch underpowered, although it was built to a very low price of around 170 euros. With a Snapdragon 212 system-on-chip, 2 GB of RAM, 16 GB of storage and a 2,500 mAh battery, the real strength is in the software: it runs a 'clean' version of Sailfish OS which flies even on these apparently low specifications.

Around one thousand handsets were released (as 'developer editions') and offered over the summer months - a short run that was almost instantly snapped up by the faithful. It made some waves online, but no more. Here was a small company making the hardware, putting on the software, and distributing the machine. Sailfish OS is compact, designed for a 'buttonless' smartphone relying solely on touchscreen input, with genuine multitasking on top of a robust Linux-based OS. Its robustness was proved on this low-priced Nexus-like device.

Author: Ewan Spence
Source: http://www.forbes.com/sites/ewanspence/2016/12/31/iphone-7-plus-galaxy-s7-edge-jolla-smartphone-of-the-year/#182255f6d1ff

Sunday, 25 December 2016 18:41

The Most Stunning Space Photos of 2016

YOU DON’T HAVE to visit a galaxy far, far away to see gorgeous images of space. Leave the interstellar antics to Jyn Erso, and check out WIRED’s selection of the most stunning photos in the universe, from burping black holes to exploding supernovas.

From the launch of Juno’s 20-month orbit of Jupiter to the SpaceX landing (and explosion) to the possible discovery of an exoplanet orbiting Proxima Centauri, 2016 has been full of accomplishments in space. But this year, NASA also lost one of its greatest stars: John Glenn, fearless as the first American to orbit Earth, and, later, the oldest.

Space photos allow everyone to feel a tiny bit of the wonder Glenn experienced looking down on our planet. It’s the kind of awe that keeps you coming back. So enjoy these incredible images from the great beyond. May there be many more to come.


Source: https://www.wired.com/2016/12/stunning-space-photos-2016/#slide-12

Author: Naveed Manzoor [Toronto, Ontario] 

The internet is humongous. Finding what you need means selecting from among millions, and sometimes trillions, of search results. Even then, no one can say for sure that you have found the right information. Is the information reliable and accurate? Or would you have to go looking for another set of information that is better, or more relevant to the query? While the Internet keeps growing every single minute, the clutter makes it ever harder to keep up, and more valuable information keeps getting buried underneath it. Unfortunately, the larger the internet grows, the harder it gets to find what you need.

Think of search engines and their browsers as a set of information search tools that fetch what you need from the Internet. But a tool is only as good as the job it gets done. Google, Bing, Yahoo and the like are generic tools for Internet search: they perform a "fits all search types" job. Their search results throw tons of web pages at you, which means harder selections and less accuracy.


A simple solution for dealing with too much information on the Internet is out there, but only if you care to pay attention: here is a List of Over 1500 Search Engines and Directories to cut your research time in half.


There exists a whole world of Internet search tools that are job specific and find the information you need through filtered, precise searching. They draw on the same world wide web and look through the same web pages as the main search engines, only better. These search tools split into Specialized Search Engines and Online Directories.

Specialized Search Engines are built to drill down to a more accurate type of information. They return filtered, less cluttered search results compared to the leading search engines such as Google, Bing and Yahoo. What makes them unique is their built-in ability to use powerful customized filters, and sometimes their own databases, to deliver the type of information you need in specific file formats.

We will classify Specialized Search Engines into Meta-crawlers (or Meta-Search Engines) and Specialized Content Search Engines.

Unlike conventional search engines, Meta-crawlers don't crawl the web themselves and don't build their own web page indexes; instead, they collect (aggregate) search snippets from several mainstream search engines (Google, Bing, Yahoo and similar) all at once. They have neither proprietary search technology nor the large, expensive infrastructure that the main search engines do. The Meta-crawler aggregates the results and displays them on its own search result pages. In short, they usually concentrate on front-end technology such as the user interface and novel ways of displaying information. They generate revenue by displaying ads, and give the user the option to search for images, audio, video, news and more, simulating a typical search browsing experience.
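The aggregation step can be sketched in a few lines of Python (the engine names, canned results and scoring rule here are hypothetical; a real meta-crawler would query each engine's live results):

```python
def fetch_results(engine, query):
    # Stand-in for a per-engine API call; returns ranked URLs.
    canned = {
        "engine_a": ["https://example.com/1", "https://example.com/2"],
        "engine_b": ["https://example.com/2", "https://example.com/3"],
    }
    return canned.get(engine, [])

def meta_search(query, engines):
    """Aggregate results from several engines, deduplicate, and
    rank URLs by how many engines returned them and how highly."""
    votes = {}
    for engine in engines:
        for position, url in enumerate(fetch_results(engine, query)):
            # Earlier positions and more engines both raise the score.
            votes[url] = votes.get(url, 0) + 1.0 / (position + 1)
    return sorted(votes, key=votes.get, reverse=True)
```

A URL returned by several engines, or near the top of any one of them, floats to the top of the merged list, which is the essence of a meta-crawler's result page.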

Some well-known Meta-crawlers to explore:

  • Ixquick – A meta-search engine with options for choosing what the results should be based on. It respects information privacy, and results open in an Ixquick proxy window.
  • Carrot Search – A meta-search engine based on a variety of search engines, with clickable topic links and diagrams to narrow down search results.
  • iBoogie – A meta-search engine with customizable search type tabs; search rankings have an emphasis on clusters.
  • iSeek – Meta-search results compiled from authoritative university, government, and established noncommercial providers.
  • PDF Search Engine – Searches for documents with extensions such as .doc, .pdf, .chm, .rtf and .txt.

The Specialized Content Search Engine focuses on a specific segment of online content, which is why they are also called Topical (Subject-Specific) Search Engines. The content area may be based on topicality, media and content type, or genre of content; beyond this, the source of the material and the original function the engine performs in transforming it is what defines their specialty.

We can go a bit further and split these into three groups.

Information Contribution – The information source can be data collected from Public Contribution Resource Engines, such as social media contributions and reference platforms such as wikis; examples are YouTube, Vimeo, LinkedIn, Facebook and Reddit. The other type is the Private Contribution Resource Engine with its own searchable database, created internally by the search engine vendors; examples are Netflix (movies), Reuters (news content), TinEye (image repository) and LexisNexis (legal information).

Specialized Function – These search engines are programmed to perform a type of service that is proprietary and unique. They execute tasks that involve collecting web content as information and working on it with algorithms of their own, adding value to the results they produce.

Examples of such search engines include the Wayback Machine, which provides and maintains records of website pages that are no longer available online as a historical record; Alexa, which performs web analytics, measures traffic on websites and provides performance metrics; and Wolfram Alpha, which is more than a search engine: it gives you access to the world's facts and data and calculates answers across a range of topics.

Information Category (Subject-Specific material) – Here the search is subject specific, based on the information it retrieves, which it obtains by special arrangement with outside sources on a consistent basis. Examples are found under these broader headings:

  • Yellow Pages and Phone Directories
  • People Search
  • Government Databases and Archives
  • Public Libraries
  • News Bureaus, Online Journals, and Magazines
  • International Organizations

A Web Directory (or Link Directory) is a well-organized catalog of the World Wide Web: a collection of data organized into categories and subcategories. A directory specializes in linking to other web sites and categorizing those links. A web directory is not a search engine, and it does not show numerous web pages from a keyword search; instead, it exhibits a list of website links by category and subcategory. Most web directory entries are not found by web crawlers but are selected by human editors. The categorization encompasses the whole website rather than a single page or a set of keywords, and websites are often limited to inclusion in only a few categories. Web directories often allow site owners to submit their site for listing, with editors reviewing submissions for fitness.
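The category/subcategory structure described above can be sketched as a small Python data structure (the categories, sites and submission flow here are illustrative, not a real directory):

```python
# A link directory: whole sites filed under a category/subcategory
# tree, browsed rather than keyword-searched.
directory = {
    "Business": {
        "Finance": ["https://example-bank.com"],
        "Retail": ["https://example-shop.com"],
    },
    "Science": {
        "Astronomy": ["https://example-observatory.org"],
    },
}

def list_sites(category, subcategory):
    """Return the sites filed under one subcategory (a browse, not a
    crawl -- entries were placed there by human editors)."""
    return directory.get(category, {}).get(subcategory, [])

def submit_site(category, subcategory, url):
    """Site owners submit; an editor would review before listing."""
    directory.setdefault(category, {}).setdefault(subcategory, []).append(url)
```

Note that lookups are by category path, not by keyword, which is exactly how browsing a directory differs from querying a search engine.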

The directories fall into two broad categories: Public Directories, which require no user registration or fee; and Private Directories, whose online registration may or may not be subject to a fee for inclusion in their listings.

Public Directories may cover General Topics, or they can be Subject-Based or Domain-Specific.

The General Topics Directories carry popular reference subjects, interests, content domains and their subcategories. Examples are DMOZ (the largest directory of the Web, whose open content is mirrored at many sites and powered the Google Directory until July 20, 2011), the A1 Web Directory (a general web directory that lists quality sites under traditional categories and relevant subcategories), and phpLD (a directory script released to the public for free in 2006 and still offered as a free download).

The Subject-Based or Domain-Specific Public Directories are subject and topic focused. Among the better known are Hotfrog (a commercial web directory providing websites categorized topically and regionally), the Librarians' Index to the Internet (a directory listing program from the Library of California) and OpenDOAR (an authoritative directory of academic open access repositories).

Private Directories require online registration and may charge a fee for inclusion in their listings.

Examples of paid commercial versions:

  • Starting Point Directory - $99/Yr
  • Yelp Business Directory - $100/Yr
  • Manta.com - $299/Yr

Directories that require registration as a member, employee, student or subscriber include:

  • Government Employees Websites (Government Secure Portals)
  • Library Networks (Private, Public and Local Libraries)
  • Bureaus, Public Records Access, Legal Documents, Courts Data, Medical Records

The Association of Internet Research Specialists (AIRS) has compiled a comprehensive list it calls "Internet Information Resources." There you will find an extensive collection of search engines and interesting information resources for avid Internet research enthusiasts, especially those who seek serious information without the hassle of sifting through many pages of unfiltered Internet. Alternatively, one can search through Phil Bradley's website or The Search Engines List, which has interesting links to the many alternatives to typical search engines out there.

Searchmetrics’ annual study of top Google ranking factors undergoes radical format shift to match industry-specific needs and results.

SAN MATEO, Calif., December 13, 2016 ‒ Today’s search results are shifting dramatically to match answers to the perceived intent of a search, as Google and other search engines increasingly employ deep learning techniques to understand the motivation behind a query, according to key findings in a new Searchmetrics study.

Findings from the latest Searchmetrics Ranking Factors study, “Rebooting for Relevance,” suggest marketers face new challenges as Google deemphasizes traditional ranking factors such as collecting more backlinks and employing enough focus keywords in text. “As technical SEO factors become table stakes in online content strategies, marketers in various industries will be forced to adopt new techniques to succeed,” said Marcus Tober, Searchmetrics founder and CTO.

“Google revealed last year that it is turning to sophisticated AI and machine-learning techniques such as RankBrain to help it better understand intent behind the words searchers enter, and to make its results more relevant,” Tober says. “User signals such as how often certain results are clicked and how long people spend on a page help the search engine get a sense of how well searchers’ questions are answered. That allows it to continually refine and improve relevance.”

The findings come from Searchmetrics’ annual study of Google ranking factors, which analysed the top 20 search results for 10,000 keywords on Google.com. The aim of the analysis (carried out every year since 2012) is to identify the key factors that high ranking web pages have in common, providing generalized insights and benchmarks to help marketers, SEO professionals and webmasters.

“The most relevant content ultimately ranks by trying to match user intent - whether a searcher is looking to answer a question quickly, shopping or researching,” Tober says.

“Someone who types ‘who won Superbowl 50?’ wants a single piece of information, while a query like ‘halloween costume ideas’ is most likely best served by a series of images,” Tober explains. “A query on ‘how to tie a Windsor knot’ might be best served with video content. Our research suggests Google is getting better at interpreting user intent to show the most relevant content.”

Here are five indications from this year’s study that suggest Google is getting better at showing the most relevant results:

1. High ranking pages are significantly more relevant than those that appear lower

Higher ranking search results are significantly more relevant to the search query than those lower down, according to the study, an indication that Google recognises when content is more relevant, and then gives it a rankings boost. It’s also clear it is not simply based on a crude analysis of the number of times web pages mention keywords that match those entered in the search box.

In this year’s study, Searchmetrics has used Big Data techniques to calculate a Content Relevance score[1], a new factor that assesses the semantic relationship between the words entered in search queries and the content shown in results; in effect, it measures how closely they are related. To make Content Relevance more meaningful, its calculation actually excludes instances of simple keyword matches between search queries and search results.
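Searchmetrics does not publish the exact formula, but the general idea, comparing query and page as vectors while ignoring literal keyword matches, can be illustrated with a toy example (the two-dimensional word vectors below are invented; real systems use embeddings learned from large corpora):

```python
import math

# Tiny made-up word vectors: clothing-related words point one way,
# cooking-related words the other.
WORD_VECTORS = {
    "knot":    [0.9, 0.1],
    "tie":     [0.8, 0.2],
    "necktie": [0.85, 0.15],
    "collar":  [0.7, 0.3],
    "recipe":  [0.1, 0.9],
    "oven":    [0.05, 0.95],
}

def text_vector(words, exclude=()):
    """Average the vectors of known words, skipping excluded terms
    (here: words that literally match the query)."""
    vecs = [WORD_VECTORS[w] for w in words if w in WORD_VECTORS and w not in exclude]
    if not vecs:
        return [0.0, 0.0]
    return [sum(dim) / len(vecs) for dim in zip(*vecs)]

def relevance(query_words, page_words):
    """Cosine similarity between query and page vectors, with literal
    query-word matches excluded from the page side."""
    q = text_vector(query_words)
    p = text_vector(page_words, exclude=set(query_words))
    norm = math.hypot(*q) * math.hypot(*p)
    return sum(a * b for a, b in zip(q, p)) / norm if norm else 0.0
```

A page consisting only of the literal query words scores zero here, which mirrors the study's decision to exclude simple keyword matches from the relevance calculation.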

In general the Content Relevance scores of results positioned near the top are higher, suggesting that Google knows when content is more relevant and then places it more prominently. This rule does not apply to results found in positions 1 and 2, which tend to be reserved for top brand websites - presumably because Google considers content from more recognisable and trusted brands will better serve searchers’ needs than non-brand pages that might have slightly more relevant content. Results with the highest Content Relevance scores appear in results found in positions 3 to 6.

2. Word count is increasing on pages that rank higher, while keyword mentions fall

The number of words on higher ranked pages has been increasing for several years now, and this trend is continuing. According to Searchmetrics, this is because top performing results are more detailed, more holistic (cover more of the important aspects of a topic) and are hence better able to answer search queries.

But interestingly, even as text grows longer, the number of keywords (words that match the search query) on higher ranked pages is not increasing. This is because Google is no longer just trying to reward pages that use more matching keywords with higher rankings; it is trying to interpret the search intention and boosting the content that is most relevant to the query.

In fact, the top 20 results include 20% fewer matching keywords (on average) in the copy than in 2015. Also in 2016, just 53% of the top 20 results have the keyword in the title (compared with 75% in 2015), and fewer than 40% now have the matching keyword in the H1 tag (usually used in the HTML of web pages to tell search engines what the page is about).
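Measuring a statistic like "share of results with the keyword in the title" is straightforward; a sketch on invented result data:

```python
def keyword_in_title_share(results, keyword):
    """Fraction of results whose title contains the keyword."""
    if not results:
        return 0.0
    hits = sum(1 for r in results if keyword.lower() in r["title"].lower())
    return hits / len(results)

# Hypothetical top results for the query "windsor knot":
results = [
    {"title": "How to Tie a Windsor Knot"},
    {"title": "Necktie Styles for 2016"},
    {"title": "Windsor Knot in Six Steps"},
    {"title": "Men's Fashion Basics"},
]
```

Run over real SERP data for thousands of keywords, year over year, this is the kind of count behind the 75%-to-53% drop reported above.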

On average, pages appearing in desktop results are a third longer than those appearing in mobile search results.

3. User signals suggest Google increasingly guides searchers to exactly the right result

If Google were presenting precisely the right results to answer searchers’ queries, then more of them would visit those pages, take in what is there and leave without having to look elsewhere, having found exactly what they were looking for.

That is just what seems to be happening. Searchmetrics’ analysis of user signals indicates that bounce rates (when a searcher visits a page and leaves without clicking more pages on the same site) have risen for all positions in the top 20 search results, and for position 1 have gone up to 46% (from 37% in 2014). This is not because more people are bouncing away from pages immediately, having found that the content does not answer their question: time-on-site has also increased significantly over previous years, with people spending around 3 minutes 10 seconds on average when they visit pages listed in the top 20 results.
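The two user signals discussed, bounce rate and time-on-site, can be computed from simple session records like these (the sessions below are invented):

```python
def user_signals(sessions):
    """sessions: list of (pages_viewed, seconds_on_site) tuples.
    Returns (bounce_rate, average_time_on_site)."""
    bounces = sum(1 for pages, _ in sessions if pages == 1)
    bounce_rate = bounces / len(sessions)
    avg_time = sum(seconds for _, seconds in sessions) / len(sessions)
    return bounce_rate, avg_time

# Hypothetical sessions: two one-page visits (bounces) with long dwell
# times -- the "satisfied searcher" pattern the study describes.
sessions = [(1, 190), (1, 200), (3, 240), (2, 130)]
```

The point the study makes is visible even in this toy data: a bounce with a long dwell time signals a satisfied visitor, not a failed result.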

4. Backlinks: The rise of mobile search is making them less important

The number of backlinks coming into a page from other sites has always been an important common factor among high ranking pages. It still has a strong correlation with pages that rank well. However, it is on a downward trend as other factors such as those related to the content on the page become more important.

As well as the growing importance of content related factors, backlinks are becoming less important because of the rise of mobile search queries: pages viewed on mobile devices are often ‘liked’ or shared but seldom actively linked to.

5. Google shows longer URLs to answer search queries, not just optimised short-URL home and landing pages

Until now, marketers and SEO professionals have been able to use optimization techniques to help their site’s homepage or favored landing pages rank higher. But the study shows that the URL addresses for pages that feature in the top 20 search results are around 15% longer on average than in 2015. Searchmetrics’ hypothesis is that instead of the highly optimised home and landing pages that marketers might prefer to appear in searches (and which tend to have short, tidy URLs), Google is better able to identify and display the precise pages that answer the search intention; these pages are more likely to have longer URLs because they possibly lie buried deeper within websites.

Other important findings include:

  • Technical factors such as loading time, file size, page architecture and mobile friendliness are a prerequisite for good rankings, as these factors help to make web pages easily accessible and easy to consume for both humans and search engines. They lay the foundation for breaking into the top 20 search results with the quality and relevance of the content enabling further rankings.
  • There are significant differences between high ranking content on desktop devices and that which appears on mobile devices. For example, high ranking mobile results tend to have faster page load speeds, smaller file sizes, shorter wordcounts and fewer interactive elements such as menus and buttons.

For marketers and SEO professionals the advice from the study is clear, Tober says:
“Since Google is becoming much more sophisticated about how it interprets search intent and relevance, you also need to work harder and be smarter at understanding and delivering on these areas in the content you put on your websites. You need to use data-driven insights to analyze exactly what searchers are looking for when they type specific queries in the search box, and make sure your content answers all their questions clearly and comprehensively in the most straightforward way – and you need to do it better than your competitors.”

Google’s application of machine learning to evaluate search queries and web content means that the factors it uses to determine search rankings are constantly changing. They are becoming fluid and vary according to the context of the search (is it a travel search? An online shopping search? etc.), and according to the intention behind each individual query. Because of this, Searchmetrics will in future no longer conduct a single, generalized, universally applicable ranking factors study. Instead, in the coming months, it will publish a series of industry-specific ranking factors studies focused on verticals such as ecommerce, travel, finance and more.

To download the new Searchmetrics Ranking Factors whitepaper, please visit: 
http://www.searchmetrics.com/knowledge-base/ranking-factors/

[1] Content Relevance is based on measurement methods that use linguistic corpora and the conceptualisation of semantic relationships between words as distances in the form of vectors. For the semantic evaluation of a text, this makes it possible to analyse the keyword and the content separately from one another. We can calculate a content relevance score for a complete text on a certain keyword or topic. The higher the relevance score, the more relevant the content of the analysed landing page is for the given search query.

About the study

As in previous years, the study analysed Google US (Google.com) search results for 10,000 keywords and 300,000 websites featuring in the top 30 positions. For some factors, a more in-depth analysis required specially defined keyword sets. The correlations between different factors and the Google search rankings were calculated using Spearman's rank correlation coefficient. To provide maximum context, this year's desktop search analysis has been compared either with the equivalent desktop data from previous years or with the mobile data from 2016.
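Spearman's rank correlation, the statistic named above, can be computed with nothing but the standard library (this minimal version ignores tied ranks; the word counts below are invented):

```python
def ranks(values):
    """Rank each value 1..n by ascending order (no tie handling)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(xs, ys):
    """Spearman's rho: 1 - 6*sum(d^2) / (n*(n^2 - 1))."""
    n = len(xs)
    rx, ry = ranks(xs), ranks(ys)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

# e.g. a hypothetical factor (word count) against search positions 1..5:
word_counts = [1400, 1250, 1100, 900, 700]
positions = [1, 2, 3, 4, 5]
```

A coefficient of -1 here means the hypothetical factor rises perfectly as the numeric position falls, i.e. it is strongly associated with ranking well, since position 1 is best.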

About Searchmetrics

Changing search technology has forced SEO platform providers to up their game. These changes have created an entirely new search paradigm − search and content optimization. And since search engines have put a fence around a lot of their data, SEO platforms need to bring their own rich data to the party − and powerful tools to analyze it.

There’s only one search platform that owns its data: Searchmetrics, the world’s #1 SEO and content performance platform. We don’t rely on data from third parties. Our historical database spans nine years and contains over 250 billion pieces of information, such as keyword rankings, search terms, social links and backlinks. It includes global, mobile and local data covering organic and paid search, as well as social media. We have the largest global reach of any SEO platform, crawling the Web every day in more than 130 countries.

Searchmetrics monitors and reveals the full business available to you online. We provide our customers with a competitive advantage and help them identify new business opportunities by exposing the content consumers are engaging with on industry and competitors’ sites. Our Visibility Score − trusted by reputable media sources such as The New York Times, Bloomberg and The Guardian − reliably indicates your online presence.

We provide the insights our customers need to deliver results. Searchmetrics guides SEOs and content marketers with suggestions for creating content that improves relevance and boosts conversions. It shows the connection between social media links and overall engagement. And its analytics make clear which content performs the best and how an organization’s content performs against its competitors.

With Marcus Tober, one of the top 10 SEO minds in the world, leading Searchmetrics’ product development, we have over 100,000 users worldwide, many of whom are respected brands such as T-Mobile, eBay, Siemens and Symantec. They depend on Searchmetrics and our 12 years of product innovation to maximize their online performance.

More information: www.searchmetrics.com.

Media Contacts:
Cliff Edwards
Searchmetrics Inc.
San Mateo, Calif.
650.730.7091

Uday Radia
CloudNine PR Agency
+44 (0)7940 584161

Source : http://www.realwire.com/releases/Searchmetrics-Ranking-Factors-Rethinking-Search-Results

Facebook is undoubtedly the biggest social media platform today, making it, among other things, a target for hackers on darknet markets.

Stolen data are a popular buy on various darknet markets for criminals looking for new identities to hide their clear web activities.

As such, data breaches like the theft of Facebook usernames and passwords are not uncommon.

In a bid to protect its users, Facebook employs more than just the use of secure software to keep out criminals who supply the darknet markets with stolen information.

Facebook buys the leaked passwords from hackers on the various darknet markets, cross-references them with existing user passwords, then sends an alert to affected users to reset their passwords or make them stronger to ensure their accounts’ safety.

Cross-referencing Process is Heavy

Facebook purchases stolen passwords from hackers on various darknet markets and uses them to improve their users’ online safety.

Facebook’s Chief Security Officer Alex Stamos admits that the process is not easy at all, but it is very effective.

He mentioned that the biggest threat to the safety of user accounts is weak passwords and the reusing of passwords.

He highlights that, despite the security team’s efforts to keep Facebook secure from hackers looking to make a coin on darknet markets, ensuring user accounts safety is an entirely different and notably more difficult aspect.

Facebook’s security team apparently began their data mining venture shortly after the massive data breach of Adobe in 2013.

Their primary goal was to seek out users with weak, reused passwords that were shared across the Facebook and Adobe platforms.

Since then, they have continued to purchase leaked passwords from the various darknet markets in a bid to ensure their users’ continued safety.

Passwords are Secure

For those concerned about their passwords being accessed by the Facebook security team, Facebook’s security incident response manager assures them that the method used to cross-reference the passwords to the respective owners’ accounts does not expose the passwords themselves.

When they began buying the passwords from darknet markets, they ran the plaintext passwords through a one-way hashing function in order to link the passwords to their respective accounts.

Facebook then compares the hashes of the recovered passwords against the hashes it already stores.

If two hashes match, Facebook identifies the user and sends them a request to change their password in order to enhance account security.
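A much-simplified sketch of that cross-referencing step (Facebook's real scheme is not public, and bare SHA-256 as used here is NOT suitable for production password storage, where a salted, slow hash such as bcrypt or scrypt belongs):

```python
import hashlib

def hash_pw(password):
    # One-way hash: easy to compute, infeasible to reverse.
    return hashlib.sha256(password.encode("utf-8")).hexdigest()

# What the site already stores: account -> one-way hash (never plaintext).
stored = {
    "alice": hash_pw("correct horse battery staple"),
    "bob": hash_pw("hunter2"),
}

def users_to_alert(leaked_plaintexts):
    """Hash each leaked plaintext and return accounts whose stored hash
    matches -- those users get a password-reset prompt."""
    leaked_hashes = {hash_pw(p) for p in leaked_plaintexts}
    return sorted(u for u, h in stored.items() if h in leaked_hashes)
```

Because only hashes are compared, the security team never needs to learn which plaintext belongs to which account beyond the fact of a match.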

Facebook’s Move May Be Encouraging Cyber-crime

As expected, there has been outcry concerning the morality of the whole situation.

Purchasing stolen information from cyber-criminals in the various darknet markets could only promote their activities, especially now that they realize Facebook will simply pay them to return the stolen passwords.

Stamos admits that the username-and-password model is more than a bit outdated.

Originally devised in the 1970s for mainframe architectures, the security it provides is less than sufficient today.

This is largely why Facebook later adopted additional security measures, such as the identification of Facebook friends alongside its original two-factor authentication process, to determine whether an account has been compromised.

They have also enhanced account recovery significantly by allowing close friends to help verify your account recovery request.

Stamos insists that despite all the security measures available to protect users from cybercriminals, there will always be those who choose to skip them, and so it falls to the security team to ensure those accounts’ security.

Author:  Darknet Markets

Source:  https://darkwebnews.com/darknet-markets/facebook-buys-leaked-passwords-darknet-markets

Monday, 12 December 2016 06:25

Battle of the Secure Smartphones

Dark web users just love having their smartphone communications spied on, right? (Detect any sarcasm there?)

While no internet-connected device is 100% secure, some are definitely more armored than others. In the smartphone arena, several phones consistently rank among the best.

To which ones might I be referring? The Kali Linux NetHunter 3.0, Copperhead OS, and Blackphone 2 are a few favorites. Obviously these devices can be used by more than just those who explore the dark web, but if you’re someone who does, a little protection can’t hurt, right?

Kali Linux NetHunter 3.0


Those of you in the pen testing field already know the Kali Linux name. Given that it’s almost synonymous with security, I expected nothing less of the latest NetHunter distro.

NetHunter was created as a joint effort between Kali community member “BinkyBear” and Offensive Security. Specifically, it’s compatible with Nexus devices, including the Nexus 5, Nexus 6, Nexus 7, Nexus 9 and Nexus 10, as well as the OnePlus One.

NetHunter 3.0 supports the following attacks (and tools):

  1. Wireless 802.11 frame injection and AP mode support
  2. USB HID Keyboard Attacks
  3. BadUSB MITM Attacks
  4. Full Kali Linux toolset
  5. USB Y-cable support in the Nethunter kernel
  6. Software-defined radio support

In addition, the NetHunter features many of the pentesting weapons that its desktop counterpart has in its arsenal, such as Aircrack-ng (a suite of tools to assess Wi-Fi network security); BBQSQL, which simplifies blind SQL injection tests; and Ghost Phisher, a wireless and Ethernet security auditing and attack software program.

So, great, I have an armory of “attack” tools, but can the NetHunter defend itself? Of course!

While it includes a number of tools for defensive purposes as well, one of its more impressive features is the ability to make Linux Unified Key Setup (LUKS) encryption self-destruct, a.k.a. a “nuke option.” (This feature is available on its other platforms as well.)

The self-destruct process hasn’t officially been implemented yet, but if you’re already a Kali user and would like to try it out, there are more detailed instructions here: Testing the LUKS Nuke Patch.

Beyond this, the same tools that can be used to simulate attacks against other systems could be used to find vulnerabilities on yours.

As with the versions of Kali on other platforms, however, I wouldn't recommend it to anyone new to Linux distros. It could prove frustrating, or you might do something with it that you don't intend; its tools have a specific user base in mind.

One possible security issue to note: if you're using an older device (such as the Nexus 7) with NetHunter 3.0, you may have trouble encrypting the device.

A collaborator named jmingov on GitHub suggested the following fix for this problem:


“Atm, the easiest way is to remove the chroot from Nethunter app, encrypt the device, and then reinstall the chroot.” (For the full conversation, see GitHub: Kali NetHunter issues.)

Overall, however, it seems to be well protected.

Copperhead OS


Like the venomous snake that inspired its name, CopperheadOS is an Android operating system you don’t want to mess with.

Though it isn’t necessarily designed for pentesting like the NetHunter, it features its fair share of security attributes as well. Want to know more? Here are just a few examples:

  • Full Disk Encryption (FDE) at the filesystem layer securing all data with AES-256-XTS and all metadata with AES-256-CBC+XTS.
  • Full verified boot for all firmware and partitions of the OS. Unverified partitions containing user data are wiped via a factory reset.
  • An app permission model that lets you revoke permissions and supply false data to apps.

CopperheadOS also uses the Zygote service to launch apps using both the fork and exec calls, as opposed to stock Android, which uses fork alone. The main purpose of this change is to add protection against buffer overflow attacks.
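The fork-versus-exec distinction is easy to demonstrate outside Android. A minimal, Unix-only Python sketch (illustrative, not Zygote code): fork alone duplicates the parent's address space, while exec replaces the child's process image entirely, so nothing inherited from the parent's memory layout survives.

```python
# Demonstrates fork followed by exec: the child's process image is fully
# replaced, so state from the parent (variables, memory layout) does not
# persist into the new program. Requires a Unix-like OS.
import os
import sys

MARKER = "parent-memory"  # state that lives only in the parent's address space

pid = os.fork()
if pid == 0:
    # Child: exec replaces the entire process image with a fresh Python
    # interpreter that simply exits with code 7. MARKER does not exist there.
    os.execv(sys.executable, [sys.executable, "-c", "import sys; sys.exit(7)"])
else:
    # Parent: wait for the child and read back its exit status.
    _, status = os.waitpid(pid, 0)
    exit_code = os.WEXITSTATUS(status)
    print("child exit code:", exit_code)  # prints 7 only if exec ran
```

Because every exec'd app gets a fresh process image, an attacker can't rely on memory-layout details shared with the spawning service, which is the hardening benefit the article alludes to.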

In addition, the system makes it simple to adjust security levels. A slider located under Settings->Security->Advanced enables you to balance performance speed with security. The slider starts at 50% by default, but can easily be changed when necessary. Besides using the slider, all of the security settings can be changed manually, if that’s your preference.

Beyond these basic features, CopperheadOS offers:

  • Hardened allocator: CopperheadOS replaces the standard system allocator with a port of OpenBSD's malloc implementation, hardening the way memory is managed.
  • Protection from zero-day exploits: It patches up many vulnerabilities and makes it more difficult for an attacker to gain access.
  • Improved sandboxing and isolation for apps and services: A stricter set of policies guides the SELinux security engine, and apps are sandboxed according to seccomp-bpf.

This is only a very basic summary of the features, but more or less, CopperheadOS is saying, “Go ahead. Attack me. I dare you.”

Despite its numerous protections, it is still possible that Copperhead may have security issues under certain circumstances.

A user on Twitter recently asked about being able to install Open GApps onto his device, and CopperheadOS replied, “Sideloading stuff like opengapps compromises the security of the system by requiring an insecure recovery and no verified boot.”

It sounds as though sideloading certain apps like this one can have unintended consequences for the security of the OS. It's for this reason that the developers recommend you always use F-Droid to download apps.

Blackphone 2.0


You may already know the name Blackphone – like Kali Linux, its brand conjures images of Mr. Robot-like scenarios, where privacy is of the utmost importance.

One of Blackphone 2’s basic features that offers protection right off the bat is its Security Center, which sits at the bottom-right of the home screen. In this area, you can configure how much access individual apps have to any of your data.

The Security Center gives you the ability to seclude apps and services from one another, while still giving you an all-inclusive overview of your phone’s features.

Beyond this, it also offers a feature called Spaces, which gives you the ability to build separate, secure areas within the system. It's very much akin to the way that Qubes OS sandboxes its virtual machines so that they have limited access to one another.

Yet another valuable tool is Blackphone’s Remote Wipe feature, which lets you power off your phone, kill specific apps, or even completely wipe the device (in case of theft or loss). You can set up these features through the Blackphone Remote Access page. Of course, this can only be accessed with a passphrase, so as I always say, make sure it’s a strong passphrase, and one that you won’t forget!

By the way, part of configuring the remote wipe process is giving your phone a name. I think I’d name mine “Nick Fury.”

Unfortunately, like any OS, Blackphone 2.0 is not without its flaws. On January 6, 2016, ZDNet featured an article entitled Severe Silent Circle Blackphone vulnerability lets hackers take over, in which they explained that there was a socket left open on Blackphone 1 that’s also used by SELinux on Android.

Specifically, they found certain apps that interact with said socket, in particular agps_daemon, which has, as they put it, “…more elevated privileges than a normal shell/app user since it is a system/radio user.” It appears that they’re still searching for a solution to this problem.

That aside, the Blackphone 2, for the most part, is still a pretty hardened device, and has stronger defenses than most phones of its type.

You Blew My Cover!

As with any high-security technology, these phones (and their accompanying OSes) may take some getting used to. And in spite of their numerous security features, the user still has to take steps to make sure that the private info inside remains just that…private.

That being said, I’d recommend them to any average citizen who places a high value on confidentiality. And maybe to anyone who likes a “vodka martini, shaken, not stirred.”

Source : https://www.deepdotweb.com/2016/11/02/battle-secure-smartphones

POSTED BY: CIPHAS

How can the iPad be made even better? Can this even be possible? This article endeavors to show you a few tips and tricks which will make your iPad an amazing computing machine, whether you want to use it to take high-quality video to post on YouTube or to surf the web while in the bathtub.

If you want to secure your iPad’s backups, you can do it by opening your iPad in iTunes, going to the Summary tab and choosing Encrypt Data. That way, you will have all of your information saved in case something happens and you lose all the information on your iPad.

Most people know that photos from a camera's SD card can be viewed on the iPad, but many do not know that this requires a connection kit to link the camera to your iPad. You can search the online Apple Store to find one for your camera.

Accessories

Wait until accessories are on sale before you buy. Your device comes with what you need up front – the iPad and a charging cable – and everything else can wait. If you want a standing charger, case, keyboard, screen protection or any such item, they go on sale frequently online, so keep your eyes peeled and be patient.

If you plan to take your device into the kitchen, you absolutely must have a screen protector, stylus, and stand case. You will accidentally drop goo all over your iPad, and the protector and case will ensure it stays off the iPad itself. The stylus keeps sticky fingers off the screen, too.

iTunes

You can listen to your home iTunes library from your iPad. If you do not want to duplicate your songs, there is an alternative way to listen to them. Just enable Home Sharing on both the computer and the iPad. Then, in the iPad’s Music app, tap More and then Shared. The next step is to enjoy your tunes!

Not all of the satellite navigation applications on the iPad are free. A good substitute is the built-in Maps app. Tap Directions in the upper left of Maps. The iPad will then figure out your location and direct you through each stage of your travels to your destination.

Google Search

Do you dislike Google Search on your device? It can easily be replaced with another preferred engine. Go to Settings, pick Safari and then Search Engine. Choose the search engine you want from the list of installed engines; various search engines, such as Bing and Yahoo, are available.

The final row of icons on your device stays fixed no matter what page of icons you are on. This should be used for your most accessed apps. Generally, people like to put social applications, email or musical programs here. Play around with these slots to find what is best for you.

Try to use a wallpaper that is not that dark if you are worried about seeing smudges and fingerprints on the screen. They show up more prominently when the background is dark, so it would be a better idea to choose a wallpaper that is a bit lighter in color.

Different Keypad

When typing on an iPad, you have the option of using a different keypad. Typing on a tiny keyboard that is projected on a screen can be quite challenging. You can buy a Bluetooth keyboard and simply attach it to your tablet without a problem. This allows you to type with the same ease that you regularly enjoy on your laptop or desktop computer.

Is an app annoying you with notifications? In the Settings app, click on Notifications and you’ll be able to turn off any obnoxious apps, ensuring they don’t interrupt you in the future. You can set other options here as well, so it’s a good idea to check out what each app allows for.

iPad is a Great Entertainment Tool

The iPad is a great entertainment tool, but it can also be great for education. There are podcasts available for higher learning, as well as dozens of educational games and applications for younger children. There are also great math programs to help elementary children learn their multiplication or division tables.

You can take better pictures by focusing on your subject and improving the metering of the light. All you have to do is tap on your subject and your device will focus on this area and meter the light to make your subject more visible. Experiment with this tool to take some artsy pictures!

While you do need to be careful with your device in general, it is surprisingly resistant. Even if you drop it, it should survive. The best way to extend this feature is to purchase an appropriate case that will give an extra layer of cushion to avoid any type of damage.

If you are struggling to see your device under bright sunlight, consider buying a matte screen protector. While the iPad’s glossy screen may look stylish, in bright sunlight, the glossy finish can produce a huge amount of glare. A matte screen protector will reduce the glare from your iPad in bright conditions.

Screen Protector

While it may seem like a good idea to place your iPad in a case to protect it, why would you want to hide something that you spent so much money on? It would be best to place a screen protector on it when you want to carry it around instead of concealing it in a nondescript case.

Before you purchase any apps for your device, always read some reviews. There are many iPad apps in the marketplace that are unfit for purpose, but unfortunately, Apple makes it very difficult for you to get refunds on app purchases. Therefore, it always pays to read app reviews before reaching for your credit card.

Anyone can use an iPad successfully, whether they are a baby playing with a number learning app or a senior who wants to play Tetris. Everyone and anyone in-between can find great utility in this tablet. This article has provided tips to make the iPad the best tool for anyone who wishes to own one.

Source : https://www.musttechnews.com

If, like me and my clients, you ever receive an email about a domain name expiration, proceed with great suspicion — because many of these "notices" are a sham. They're designed to sell you services you don't need or to trick you into transferring your domain name to another registrar.

Usually, the emails can safely be ignored.

Here's an example:

In a typical example, an important-looking email from "Domain Service" refers to a specific domain name in the subject line. The body of the email states that it is an "EXPIRATION NOTICE." However, the finer print states that the expiration is not for the domain name registration itself but instead for "search engine optimization submission" — services that the recipient of the email has never purchased (and probably doesn't want).

Many recipients of these emails likely click the payment link thinking they should do so to ensure that their domain names don't expire.

While this is obviously misleading, it isn't new.

In 2010, the U.S. Federal Trade Commission warned about these frauds in a press release titled "FTC Halts Cross Border Domain Name Registration Scam." The FTC said:

The Federal Trade Commission has permanently halted the operations of Canadian con artists who allegedly posed as domain name registrars and convinced thousands of U.S. consumers, small businesses and non-profit organizations to pay bogus bills by leading them to believe they would lose their Web site addresses unless they paid. Settlement and default judgment orders signed by the court will bar the deceptive practices in the future.
In June 2008, the FTC charged Toronto-based Internet Listing Service with sending fake invoices to small businesses and others, listing the existing domain name of the consumer's Web site or a slight variation on the domain name, such as substituting ".org" for ".com." The invoices appeared to come from the businesses' existing domain name registrar and instructed them to pay for an annual "WEBSITE ADDRESS LISTING." The invoices also claimed to include a search engine optimization service. Most consumers who received the "invoices" were led to believe that they had to pay them to maintain their registrations of domain names. Other consumers were induced to pay based on Internet Listing Service's claims that its "Search Optimization" service would "direct mass traffic" to their sites and that their "proven search engine listing service" would result in "a substantial increase in traffic."

The FTC's complaint charged that most consumers who paid the defendants' invoices did not receive any domain name registration services and that the "search optimization" service did not result in increased traffic to the consumers' Web sites.

And, in 2014, ICANN issued a similar warning, "Be Careful What You Click: Alert of New Fraudulent Domain Renewal Emails." In its alert, ICANN said:

Recently, online scammers have targeted domain name registrants with a registration renewal scam in order to fraudulently obtain financial information. The scam unfolds as follows. The scammer sends an email to a domain registrant that offers an opportunity to renew a registration, and encourages the email recipient to "click here" to renew online at attractively low rates. These emails appear to be sent by ICANN. The scammers even lift ICANN's branding and logo and include these in both the body of the email message and at the fake renewal web page, where the scammers will collect any credit card or personal information that victims of the scam submit.

Here are some simple steps to avoid falling for these types of scams:

  • Check your domain name registrations to ensure that the email contacts in the "whois" records are accurate and that, in the case of domain names owned and used by companies, only current personnel educated about the domain name system are listed as contacts (because the fraudsters send their notices to contacts in the whois records).
  • Don't click on any links in a suspicious email about a domain name "expiration." These links typically contain tracking technology that enables the sender to identify the simple fact that you have clicked — which could increase the likelihood you will receive further notices or spam.
  • If you are truly concerned that a notice may be legitimate or that your domain name may be at risk of expiring, simply check its expiration date in the whois record. Then, confirm with your current registrar that the domain name is set to auto-renew (if desired) and that your payment information is accurate. If you plan to keep the domain name for a long time, consider renewing it for the longest possible term (often 10 years).
  • Set your domain name's lock status (at your registrar) to help prevent unauthorized transfers. To see whether your domain name is locked, look for a status such as "clientTransferProhibited" in the whois record.
  • And, of course, simply delete any suspicious "expiration" emails.
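The whois checks in the steps above can be scripted. Here's a minimal Python sketch, assuming you've saved a plain-text whois response to check; the sample record and field labels below are illustrative only, since real registrars vary the exact wording:

```python
# Parse a saved whois response for the two fields the checklist mentions:
# the expiry date and the transfer-lock status. The sample record is
# made up; production code would need patterns for each registrar's format.
import re

SAMPLE_WHOIS = """\
Domain Name: EXAMPLE.COM
Registry Expiry Date: 2026-08-13T04:00:00Z
Domain Status: clientTransferProhibited https://icann.org/epp#clientTransferProhibited
"""

def expiry_date(record: str):
    """Return the expiry timestamp string, or None if not found."""
    m = re.search(r"Registry Expiry Date:\s*(\S+)", record)
    return m.group(1) if m else None

def is_transfer_locked(record: str) -> bool:
    """True if the record shows the clientTransferProhibited lock."""
    return "clientTransferProhibited" in record

print(expiry_date(SAMPLE_WHOIS))         # 2026-08-13T04:00:00Z
print(is_transfer_locked(SAMPLE_WHOIS))  # True
```

Checking these two fields yourself — rather than trusting a link in an unsolicited email — is exactly the habit the advice above recommends.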

Author:  Doug Isenberg

Source:  http://www.circleid.com/

Sunday, 06 November 2016 14:33

Google is changing search in a big way

Google is now starting to experiment with one of its biggest changes to search. The company is beginning to test a new "mobile first" version of its search index, meaning the company will prioritize mobile content in its search results.

First, a refresher on how Google Search works: Google's bots crawl the web tracking more than 60 trillion web pages and the links within them. Google then categorizes them into a massive index based on hundreds of different factors. This index, along with a series of algorithms, is what enables Google to return relevant search results — that list of blue links — when you enter a query into the search box.
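At toy scale, the crawl-then-index step described above looks like building an inverted index: a map from each term to the pages containing it, which is what makes query lookups fast. A deliberately simplified Python sketch (page names are invented, and Google's real ranking of course uses hundreds of signals, not a bare intersection):

```python
# Toy inverted index: map each term to the set of pages containing it,
# then answer a query by intersecting the term sets.
from collections import defaultdict

pages = {
    "a.com": "mobile search results ranking",
    "b.com": "desktop search tips",
    "c.com": "mobile site design",
}

index = defaultdict(set)
for url, text in pages.items():
    for term in text.split():
        index[term].add(url)

def search(query: str):
    """Return the pages containing every term in the query."""
    hits = set(pages)
    for term in query.split():
        hits &= index.get(term, set())
    return sorted(hits)

print(search("mobile search"))  # ['a.com']
```

The mobile-first change doesn't alter this basic structure; it changes *which version* of each page's content gets fed into the index and used for ranking.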

With the new update, Google will determine the rankings of pages based on their mobile content. (While it was previously reported that Google was creating an entirely separate mobile index, the company says it will be using the same index as before but that it will use mobile sites for its page ranking.)
 

"Although our search index will continue to be a single index of websites and apps, our algorithms will eventually primarily use the mobile version of a site’s content to rank pages from that site, to understand structured data, and to show snippets from those pages in our results," writes Google product manager Doantam Phan. 

There are a lot of implications to this change, but the most obvious one is that sites that don't have functional mobile versions will likely lose out, and turn up farther down in search results. With this move, Google's message is very clear: The time to adapt to mobile is now. 

This is a big change and one that "will take some time" to be implemented fully, according to Phan, but for users this means mobile search results will get a lot better. That's good news for users since the majority of Google searches now come from mobile devices — the impetus behind Google's desire to optimize its core product for that audience.

Though Google is still only testing the change, the company offers a few suggestions to those who want to make sure their sites are ready. You can take a look at them over at Google's Webmaster blog.

Source: mashable

Monday, 15 August 2016 22:09

Here Are Four Steps to Un-Google Yourself

Remember when campus police officers pepper-sprayed student protesters at the University of California, Davis, in 2011? Of course you do: A viral image of the incident spread like wildfire online, and a quick Google search turns up plenty of articles criticizing the police and administrative response to the peaceful demonstration. To save face, UC Davis Chancellor Linda Katehi issued a directive: "Get me off the Google."


As Katehi quickly learned, however, that's easier said than done. But it's not impossible. If you're hoping to disappear from Google search results, here are some steps you can take.

1. It's important to have a strategy.

Start by searching your name and compiling a list of sites where your personal information appears.


2. OK, now you have a better sense of the task at hand.

The first thing you might notice after searching your name is that your personal and social media accounts appear front and center. Go ahead and delete those.


3. What about sites where your personal information is beyond your control?

On data collection sites such as Spokeo or PeopleFinder, for example, you're going to have to reach out to the webmaster individually and request the deletion.


Sometimes this matter can be easily resolved online; other times, it requires actual paperwork. It depends on the site, CNET reported. To expedite the process, you can pay third-party services such as DeleteMe to do the legwork for you.

4. Alternatively, you can ask Google to remove URLs from its search listings.

It's not guaranteed that the search engine will comply with your request (the site would've had to violate a company policy to warrant deletion). But if a webmaster isn't responsive, it's one option that's available to you.


Per Google policy, the company will remove personal information that falls under these categories: national identification numbers (e.g. your social security number), bank account numbers, credit card numbers, images of your signature, and "[nude] or sexually explicit images that were uploaded or shared without your consent."

As a general rule, Google won't remove URLs from its search results that contain your date of birth, address, or telephone number. But, as the company said, it will "apply this policy on a case-by-case basis."

Source : attn.com



Association of Internet Research Specialists is the world's leading community for Internet Research Specialists and provides a unified platform that delivers education, training and certification for online research.
