
LastPass' new Security Dashboard gives users a complete picture of their online security

Knowing whether your passwords have been leaked online is an important step in protecting your online accounts, which is why LastPass has unveiled a new Security Dashboard that gives end users a complete overview of the security of their online accounts.

The company's new Security Dashboard builds on last year's LastPass Security Challenge, which analyzed users' stored passwords and provided a score based on how secure they were, by adding dark web monitoring. The new feature is available to LastPass Premium, Families, and Business customers, and it proactively watches for breach activity and alerts users when they need to take action.

 

In addition to showing users their weak and reused passwords, the new Security Dashboard now gives all LastPass users a complete picture of their online security to help them regain control over their digital life and know that their accounts are protected.

Dark web monitoring

According to a recent survey of more than 3,000 global consumers conducted by LastPass, 40 percent of users don't know what the dark web is. The majority (86%) of those surveyed claimed they have no way of even knowing if their information is on the dark web.

LastPass' new dark web monitoring feature proactively checks email addresses and usernames against Enzoic's database of breached credentials. If an email address is found in this third-party database, users are notified immediately via email and by a message in their LastPass Security Dashboard. Users are then prompted to update the password for the compromised account.
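LastPass's exact integration is not public, but the shape of such a check can be sketched. Below is a minimal illustration, with an invented local set of hashes standing in for Enzoic's database (a real service would be queried over an API, not a local set, and the addresses here are made up):

```python
import hashlib

# Toy stand-in for a breached-credentials database (invented data).
# A real service like Enzoic exposes this via an API, not a local set.
BREACHED_HASHES = {
    hashlib.sha256(b"alice@example.com").hexdigest(),
    hashlib.sha256(b"bob@example.com").hexdigest(),
}

def is_breached(email: str) -> bool:
    """Hash the (normalized) email and look it up in the breached-hash set."""
    digest = hashlib.sha256(email.lower().encode("utf-8")).hexdigest()
    return digest in BREACHED_HASHES

def alert_if_breached(email: str) -> str:
    # Mirrors the flow described above: alert and prompt a password update.
    if is_breached(email):
        return f"ALERT: {email} found in breach data - update this password"
    return f"{email}: no breach found"
```

Comparing hashes rather than raw addresses is a common design choice here, since it lets the monitoring service avoid storing plaintext identifiers.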

Vice president of product management, IAM at LogMeIn, Dan DeMichele explained why LastPass decided to add dark web monitoring to its password manager in a press release, saying:

“It’s extremely important to be informed of ways to protect your identity if your login, financial or personal information is compromised. Adding dark web monitoring and alerting into our Security Dashboard was a no brainer for us. LastPass already takes care of your passwords, and now you can extend that protection to more parts of your digital life. LastPass is now equipped to truly be your home for managing your online security – making it simple to take action and stay safe in an increasingly digital world. With LastPass all your critical information is safe so you can access it whenever and wherever you need to.”

[Source: This article was published in techradar.com By Anthony Spadafora - Uploaded by the Association Member: Anna K. Sasaki]

Categorized in Internet Privacy

Privacy on the internet is very important to many users, and to achieve it they turn to Tor or a VPN. But which is better? What are the advantages of using one or the other? In today's article we will look in detail at the advantages and disadvantages of both.

If we talk about internet privacy, most people generally do not pay much attention to it. They keep all their data in their Google accounts, log in anywhere, and leave their social networks unconfigured to protect their privacy.

We could go on giving examples all day. But what can happen if I expose my data this way? The simple answer? Anything.

From attacks by cybercriminals to surveillance by various government agencies to restricted access to websites, anything can happen, since information is one of the most powerful tools you can hand to a company or individual.

 

When we surf the internet in a normal way, so to speak, we are never doing it anonymously. Even the incognito mode of the most popular browsers is not an effective method to achieve this.

It is precisely for this reason that many users decide to use a VPN or browse through Tor. Both systems are very good for browsing the internet anonymously, although their differences are notable, and we will cover them below.

Main advantages of using a VPN

How a VPN works is quite simple to explain: it adds a private network on top of our connection. In short, the VPN takes our connection, encrypts it, and then sends it on to the destination server.

The way it works is quite simple, at least at a basic level. Instead of connecting directly to a website, we first go through an intermediate server and then reach the destination site through that intermediate server.

Using a VPN is highly recommended for those who connect to the internet from public Wi-Fi networks. Another of its great advantages is that you can disguise your real location.

Suppose you are in Argentina but the VPN server is in the United States. Every website you access will believe you are in the United States, which comes in handy for bypassing all kinds of content blocking on the internet.
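The routing described above can be shown with a toy simulation (all names and addresses are invented for the example): the destination only ever sees the VPN server's address, never the client's.

```python
def fetch_via_vpn(client_ip: str, vpn_ip: str, destination: str) -> dict:
    """Toy model of VPN routing: the client talks to the VPN server,
    and the VPN server contacts the destination on the client's behalf."""
    # The destination logs the address of whoever contacted it directly,
    # which is the VPN server, not the real client.
    return {
        "destination": destination,
        "real_client": client_ip,
        "ip_seen_by_site": vpn_ip,
    }

# A client in Argentina using a VPN server in the United States:
request = fetch_via_vpn("190.0.2.10", "203.0.113.5", "example.com")
```

This is only the routing half of the story; a real VPN also encrypts the traffic between the client and the VPN server so that the local network operator cannot read it.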

 

Main advantages of using Tor

The idea of Tor is to keep the user anonymous at all times while browsing the internet. To achieve this, our traffic passes through a large number of nodes before the website ever sees it. In this way, it is not possible to determine our location or connection details such as our IP address.

Although it is a reliable system that improves our privacy on the internet, browsing completely anonymously is not possible, not even with Tor, since at the final node the data is decrypted in order to access the site in question. We are still exposed there, although Tor makes it much harder for anyone to find out anything about us.
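The multi-node scheme described above is called onion routing: the client wraps the message in one layer of encryption per node, and each node peels exactly one layer, so only the exit node ever sees the plaintext. A toy sketch follows, using XOR as a stand-in for real encryption purely for illustration (Tor itself uses proper ciphers, and all keys here are invented):

```python
def xor_layer(data: bytes, key: bytes) -> bytes:
    """Toy symmetric 'cipher': XOR with a repeating key (illustration only)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def wrap_onion(message: bytes, node_keys: list) -> bytes:
    # The client encrypts for the exit node first, then each earlier hop,
    # so the entry node's layer ends up outermost.
    for key in reversed(node_keys):
        message = xor_layer(message, key)
    return message

def route_through_nodes(cell: bytes, node_keys: list) -> bytes:
    # Each relay peels exactly one layer; only the exit node sees plaintext.
    for key in node_keys:
        cell = xor_layer(cell, key)
    return cell

keys = [b"entry-key", b"middle-key", b"exit-key"]
cell = wrap_onion(b"GET https://example.com", keys)
plaintext = route_through_nodes(cell, keys)
```

The point of the layering is that no single relay knows both who you are and what you requested: the entry node sees your address but only ciphertext, and the exit node sees the request but not your address.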

When we use Tor, we are much more secure than with any ordinary browser. But bear in mind that it is not an infallible system: we will be much safer visiting websites with secure connections (HTTPS) than sites that do not have encryption enabled.

One very important extra to always keep in mind: if a website is not secure, that is, not encrypted (HTTPS), do not enter any kind of information into it. By this we mean login details, email, bank accounts, credit cards, and so on.
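That rule of thumb can even be automated. As a minimal sketch, a client could refuse to send credentials to any URL whose scheme is not `https` (the function names and the placeholder submission are invented for the example):

```python
from urllib.parse import urlparse

def safe_to_submit(url: str) -> bool:
    """Only allow form submissions over encrypted (HTTPS) connections."""
    return urlparse(url).scheme == "https"

def submit_login(url: str, username: str, password: str) -> str:
    # Guard clause: never send credentials over an unencrypted connection.
    if not safe_to_submit(url):
        raise ValueError(f"refusing to send credentials over insecure URL: {url}")
    # Placeholder: a real client would POST the form here.
    return f"would POST credentials for {username} to {url}"
```

This is essentially what browsers do when they warn about password fields on non-HTTPS pages.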

Tor vs. VPN: Which one should you use?

The first thing you should know is that most quality VPNs are paid. Tor, on the other hand, is totally free, and we will never have to pay anything at any time.

Another thing to keep in mind is that many VPN services do store user data, for obvious reasons. Anonymity is lost this way, especially if the provider has to answer to the law.

In the case of Tor this does not happen; its only real problem is that browsing speed is not exactly the best, regardless of the speed of your connection.

The bottom line is pretty simple: if you are an average user concerned about how companies use your private data, then a VPN is the best choice. It will be faster than Tor, which lets you consume multimedia content without any kind of problem.

Tor, on the other hand, is for people who need a high degree of anonymity on the internet. It is quite common among people who have to confront governments, like various journalists in Venezuela, to give one example.

The differences between Tor and a VPN are quite clear. Each is used for something slightly different, though both promise anonymity. But bear in mind that long-term, total anonymity on the internet does not exist.

[Source: This article was published in explica.co - Uploaded by the Association Member: Anthony Frank] 

Categorized in Internet Privacy

“For me, trust has to be earned. It’s not something that can be demanded or pulled out of a drawer and handed over. And the more government or the business sector shows genuine regard and respect for peoples’ privacy in their actions, as well as in their word and policies, the more that trust will come into being.” Dr. Anita L. Allen

Dr. Anita Allen serves as Vice Provost for Faculty and Henry R. Silverman Professor of Law and Philosophy at the University of Pennsylvania. Dr. Allen is a renowned expert in the areas of privacy, data protection, ethics, bioethics, and higher education, having authored the first casebook on privacy law and having been awarded numerous accolades and fellowships for her work. She earned her JD from Harvard and both her Ph.D. and master’s in philosophy from the University of Michigan. I had the opportunity to speak with her recently about her illustrious career, the origins of American privacy law and her predictions about the information age.

 

Q: Dr. Allen, a few years ago you spoke to the Aspen Institute and offered a prediction that “our grandchildren will resurrect privacy from a shallow grave just in time to secure the freedom, fairness, democracy, and dignity we all value… a longing for solitude and independence of mind and confidentiality…” Do you still feel that way, and if so, what will be the motivating factors for reclaiming those sacred principles?

 

A: Yes, I believe that very hopeful prediction will come true because there’s an increasing sense in the general public of the extent to which we have perhaps unwittingly ceded our privacy controls to the corporate sector, and in addition to that, to the government. I think the Facebook problems that had been so much in the news around Cambridge Analytica have made us sensitive and aware of the fact that we are, by simply doing things we enjoy, like communicating with friends on social media, putting our lives in the hands of strangers.

And so, these kinds of disclosures, whether they’re going to be on Facebook or some other social media business, are going to drive the next generation to be more cautious. They’ll be circumspect about how they manage their personal information, leading to, I hope, eventually, a redoubled effort to ensure our laws and policies are respectful of personal privacy.

Q: Perhaps the next generation heeds the wisdom of their elders and avoids the career pitfalls and reputational consequences of exposing too much on the internet?

A: I do think that’s it as well. Your original question was about my prediction that the future would see a restoration of concern about privacy. I believe that, yes, as experience shows the younger generation just what the consequences are of living your life in public view, there will be a turnaround to some extent. It will get people to focus on what they have to lose. It’s not just that you could lose job opportunities. You could lose school admissions. You could lose relationship opportunities and the ability to find the right partner because your reputation is so horrible on social media.

All of those consequences are causing people to be a little more reserved. It may lead to a big turnaround when people finally get enough control over their understanding of those consequences that they activate their political and governmental institutions to do better by them.

Q: While our right to privacy isn’t explicitly stated in the U.S. Constitution, it’s reasonably inferred from the language in the amendments. Yet today, “the right to be forgotten” is an uphill battle. Some bad actors brazenly disregard a “right to be left alone,” as defined by Justice Brandeis in 1890. Is legislation insufficient to protect privacy in the Information Age, or is the fault on the part of law enforcement and the courts?

A: I’ve had the distinct pleasure to follow developments in privacy law pretty carefully for the last 20 years, now approaching 30, and am the author or co-author of numerous textbooks on the right to privacy in the law, and so I’m familiar with the legal landscape. I can say from that familiarity that the measures we have in place right now are not adequate. It’s because the vast majority of our privacy laws were written literally before the internet, and in some cases in the late 1980s or early 1990s or early 2000s as the world was vastly evolving. So yes, we do need to go back and refresh our electronic communications and children’s internet privacy laws. We need to rethink our health privacy laws constantly. And all of our privacy laws need to be updated to reflect existing practices and technologies.

 

The right to be forgotten, which is a right described today as a new right created by the power of Google, is an old right that goes back to the beginning of privacy law. Even in the early 20th century, people were concerned about whether or not dated, but true information about people could be republished. So, it’s not a new question, but it has a new shape. It would be wonderful if our laws and our common law could be rewritten so that the contemporary versions of old problems, and completely new issues brought on by global technologies, could be rethought in light of current realities.

Q: The Fourth Amendment to the Constitution was intended to protect Americans from warrantless search and seizure. However, for much of our history, citizens have observed as surveillance has become politically charged and easily abused. How would our founders balance the need for privacy, national security, and the rule of law today?

A: The Fourth Amendment is an amazing provision that protects persons from warrantless search and seizure. It was designed to protect peoples’ correspondence, letters, papers, as well as business documents from disclosure without a warrant. The idea of the government collecting or disclosing sensitive personal information about us was the same then as it is now. The fact that it’s much more efficient to collect information could be described as almost a legal technicality as opposed to a fundamental shift.

I think that while the founding generation couldn’t imagine the fastest computers we all have on our wrists and our desktops today, they could understand entirely the idea that a person’s thoughts and conduct would be placed under government scrutiny. They could see that people would be punished by virtue of government taking advantage of access to documents never intended for them to see. So, I think they could very much appreciate the problem and why it’s so important that we do something to restore some sense of balance between the state and the individual.

Q: Then, those amendments perhaps anticipated some of today’s challenges?

A: Sure. Not in the abstract, but think of it in the concrete. If we go back to the 18th and 19th centuries, you will find some theorists speculating that someday there will be new inventions that will raise these types of issues. Warren and Brandeis talked specifically about new inventions and business methods. So, it’s never been far from the imagination of our legal minds that more opportunities would come through technology. They anticipated technologies that would do the kinds of things once only done with pen and paper, things that can now be done in cars and with computers. It’s a structurally identical problem. And so, while I do think our laws could be easily updated, including our constitutional laws, the constitutional principles are beautiful in part because fundamentally they do continue to apply even though times have changed quite a bit.

Some of the constitutional languages we find in other countries around ideas like human dignity, which is now applied to privacy regulations, shows that, to some extent, very general constitutional language can be put to other purposes.

Q: In a speech to the 40th International Data Protection and Privacy Commissioners Conference, you posited that “Every person in every professional relationship, every financial transaction and every democratic institution thrives on trust. Openly embracing ethical standards and consistently living up to them remain the most reliable ways individuals and businesses can earn the respect upon which all else depends.” How do you facilitate trust, ethics, and morality in societies that have lost confidence in the authority of their institutions and have even begun to question their legitimacy?

A: For me, trust has to be earned. It’s not something that can be demanded or pulled out of a drawer and handed over. Unfortunately, the more draconian and unreasonable state actors behave respecting people’s privacy, the less people will be able to generate the kind of trust that’s needed. And the more government or the business sector shows genuine regard and respect for peoples’ privacy in their actions, as well as in their word and policies, the more that trust will come into being.

I think that people have to begin to act in ways that make trust possible. I have to act in ways that make trust possible by behaving respectfully towards my neighbors, my family members, and my colleagues at work, and they the same toward me. The businesses that we deal with have to act in ways that are suggestive of respect for their customers and their vendors. Up and down the chain. That’s what I think. There’s no magic formula, but I do think there’s some room for conversation for education in schools, in religious organizations, in NGOs, and policy bodies. There is room for conversations that enable people to find discourses about privacy, confidentiality, data protection that can be used when people demonstrate that they want to begin to talk together about the importance of respect for these standards.

It’s surprising to me how often I’m asked to define privacy or define data protection. When we’re at the point where experts in the field have to be asked to give definitions of key concepts, we’re, of course, at a point where it’s going to be hard to have conversations that can develop trust around these ideas. That’s because people are not always even talking about the same thing. Or they don’t even know what to talk about under the rubric. We’re in the very early days of being able to generate trust around data protection, artificial intelligence, and the like because it’s just too new.

Q: The technology is new, but the principles are almost ancient, aren’t they?

A: Exactly. If we have clear conceptions about what we’re concerned about, whether its data protection or what we mean by artificial intelligence, then those ancient principles can be applied to new situations effectively.

Q: In a world where people have a little less shame about conduct, doesn’t that somehow impact the general population’s view of the exploitation of our data?

A: It seems to me we have entered a phase where there’s less shame, but a lot of that’s OK because I think we can all agree that maybe in the past, we were a bit too ashamed of our sexuality, of our opinions. Being able to express ourselves freely is a good thing. I guess I’m not sure yet on where we are going because I’m thinking about, even like 50 years ago, when it would have been seen as uncouth to go out in public without your hat and gloves. We have to be careful that we don’t think that everything that happens that’s revealing is necessarily wrong in some absolute sense.

 

It’s different to be sure. But what’s a matter of not wearing your hat and gloves, and what’s a matter of demeaning yourself? I certainly have been a strong advocate for moralizing about privacy and trying to get people to be more reserved and less willing to disclose when it comes to demeaning oneself. And I constantly use the example of Anthony Weiner as someone who, in public life, went too far, and not only disclosed but demeaned himself in the process. We do want to take precautions against that. But if it’s just a matter of, “we used to wear white gloves to Sunday school, and now we don’t…” If that’s what we’re talking about, then it’s not that important.

Q: You studied dance in college and then practiced law after graduating from Harvard, but ultimately decided to dedicate your career to higher education, writing, and consulting. What inspired you to pursue an academic career, and what would you say are the lasting rewards?

A: I think a love of reading and ideas guided my career. Reading, writing, and ideas, and independence governed my choices. As an academic, I get to be far freer than many employees are. I get to write what I want to write, to think about what I want to think, and to teach and to engage people in ideas, in university, and outside the university. Those things governed my choices.

I loved being a practicing lawyer, but you have to think about and deal with whatever problems the clients bring to you. You don’t always have that freedom of choice of topic to focus on. Then when it comes to things like dance or the arts, well, I love the arts, but I think I’ve always felt a little frustrated about the inability to make writing and debate sort of central to those activities. I think I am more of a person of the mind than a person of the body ultimately.

 

[Source: This article was published in cpomagazine.com By RAFAEL MOSCATEL - Uploaded by the Association Member: Grace Irwin]

Categorized in Internet Ethics

As we close out 2019, we at Security Boulevard wanted to highlight the five most popular articles of the year. Following is the fifth in our weeklong series of the Best of 2019.

Privacy. We all know what it is, but in today’s fully connected society can anyone actually have it?

For many years, it seemed the answer was no. We didn’t care about privacy. We were so enamored with Web 2.0, the growth of smartphones, GPS satnav, instant updates from our friends and the like that privacy barely registered. But while industry professionals argued the company was collecting too much private information, Facebook CEO Mark Zuckerberg understood the vast majority of Facebook users were not as concerned. He said in a 2011 Charlie Rose interview, “So the question isn’t what do we want to know about people. It’s what do people want to tell about themselves?”

 

In the past, it would be perfectly normal for a private company to collect personal, sensitive data in exchange for free services. Further, privacy advocates were almost criticized for being alarmist and unrealistic. Reflecting this position, Scott McNealy, then-CEO of Sun Micro­systems, infamously said at the turn of the millennium, “You have zero privacy anyway. Get over it.”

And for another decade or two, we did. Privacy concerns were debated; however, serious action on the part of corporations and governments seemed moot. Ten years ago, the Payment Card Industry Security Standards Council had the only meaningful data security standard, ostensibly imposed by payment card issuers against processors and users to avoid fraud.

Our attitudes have shifted since then. Expecting data privacy is now seen by society as perfectly normal. We are thinking about digital privacy like we did about personal privacy in the ’60s, before the era of hand-held computers.

So, what happened? Why does society now expect digital privacy? Especially in the U.S., where privacy under the law is not so much a fundamental right as a tort? There are a number of factors, of course. But let’s consider three: a data breach that gained national attention, an international elevation of privacy rights and growing frustration with lax privacy regulations.

Our shift in the U.S. toward expecting more privacy started accelerating in December 2013 when Target experienced a headline-gathering data breach. The termination of the then-CEO and the staggering operating loss the following year, allegedly due to customer dissatisfaction and reputation erosion from this incident, got the boardroom’s attention. Now, data privacy and security are chief strategic concerns.

On the international stage, the European Union started experimenting with data privacy legislation in 1995. Directive 95/46/EC required national data protection authorities to explore data protection certification. This resulted in an opinion issued in 2011 which, through a series of opinions and other actions, resulted in the General Data Protection Regulation (GDPR) entering force in 2016. This timeline is well-documented on the European Data Protection Supervisor’s website.

It wasn’t until 2018, however, that we noticed GDPR’s fundamental privacy changes. Starting then, websites that collected personal data had to notify visitors and ask for permission first. Notice the pop-ups everywhere asking for permission to store cookies? That’s a byproduct of the GDPR.

What happened after that? Within a few short years, many local governments in the U.S. became more and more frustrated with the lack of privacy progress at the national level. GDPR was front and center, with several lawsuits filed against high-profile companies that allegedly failed to comply.

As the GDPR demonstrated the possible outcomes of serious privacy regulation, smaller governments passed such legislation. The State of California passed the California Consumer Privacy Act and—almost simultaneously—the State of New York passed the Personal Privacy Protection Law. Both of these laws give U.S. citizens significantly more privacy protection than any under U.S. law. And not just to state residents, but also to other U.S. citizens whose personal data is accessed or stored in those states.

Without question, we as a society have changed course. The unfettered internet has had its day. Going forward, more and more private companies will be subject to increasingly demanding privacy legislation.

Is this a bad thing? Something nefarious? Probably not. Just as we have always expected privacy in our physical lives, we now expect privacy in our digital lives as well. And businesses are adjusting toward our expectations.

One visible adjustment is more disclosure about exactly what private data a business collects and why. Privacy policies are easier to understand, as well as more comprehensive. Most websites warn visitors about the storage of private data in “cookies.” Many sites additionally grant visitors the ability to turn off such cookies except those technically necessary for the site’s operation.

Another visible adjustment is the widespread use of multi-factor authentication. Many sites, especially those involving credit, finance or shopping, validate login with a token sent by email, text or voice. These sites then verify the authorized user is logging in, which helps avoid leaking private data.
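The token step described above can be sketched minimally; the 6-digit code, the 5-minute lifetime, and the function names are all invented for illustration, but the constant-time comparison is a standard precaution:

```python
import hmac
import secrets
import time

def issue_token(lifetime_s: int = 300) -> dict:
    """Generate a 6-digit one-time code with an expiry timestamp."""
    code = f"{secrets.randbelow(10**6):06d}"  # cryptographically random
    return {"code": code, "expires": time.time() + lifetime_s}

def verify_token(token: dict, submitted: str) -> bool:
    if time.time() > token["expires"]:
        return False  # expired codes are never accepted
    # compare_digest avoids leaking the code through timing differences
    return hmac.compare_digest(token["code"], submitted)

token = issue_token()
```

The site would send `token["code"]` by email, text, or voice, keep only the server-side record, and accept the login if `verify_token` returns true before the code expires.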

Perhaps the biggest adjustment is not visible: encryption of private data. More businesses now operate on otherwise meaningless cipher substitutes (the output of an encryption function) in place of sensitive data such as customer account numbers, birth dates, email or street addresses, member names and so on. This protects customers when an all-too-common breach does occur, since the stolen substitutes are useless on their own.
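One common form of this practice is tokenization: the application handles random surrogate values while the real data sits in a separately protected vault. A minimal sketch follows; the in-memory dict stands in for what would really be an encrypted, access-controlled store, and the card number is a standard test value:

```python
import secrets

class TokenVault:
    """Maps random surrogate tokens to sensitive values (toy in-memory vault)."""

    def __init__(self):
        self._store = {}

    def tokenize(self, sensitive: str) -> str:
        # The token is random, so it reveals nothing outside the vault.
        token = "tok_" + secrets.token_hex(8)
        self._store[token] = sensitive
        return token

    def detokenize(self, token: str) -> str:
        return self._store[token]

vault = TokenVault()
card_token = vault.tokenize("4111-1111-1111-1111")  # e.g. a card number
```

A breach of the application database then yields only `tok_…` strings; the attacker gains nothing without also compromising the vault.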

Respecting privacy is now the norm. Companies that show this respect will be rewarded for doing so. Those that allegedly don’t, however, may experience a different fiscal outcome.

 

[Source: This article was published in securityboulevard.com By Jason Paul Kazarian - Uploaded by the Association Member: Jason Paul Kazarian]

Categorized in Internet Ethics

PimEyes markets its service as a tool to protect privacy and prevent the misuse of images

Ever wondered where you appear on the internet? Now, a facial recognition website claims you can upload a picture of anyone and the site will find that same person’s images all around the internet.

PimEyes, a Polish facial recognition website, is a free tool that allows anyone to upload a photo of a person’s face and find more images of that person from publicly accessible websites like Tumblr, YouTube, WordPress blogs, and news outlets.

In essence, it’s not so different from the service provided by Clearview AI, which is currently being used by police and law enforcement agencies around the world. PimEyes’ facial recognition engine doesn’t seem as powerful as Clearview AI’s app is supposed to be. And unlike Clearview AI, it does not scrape most social media sites.

PimEyes markets its service as a tool to protect privacy and prevent the misuse of images. But there’s no guarantee that someone will upload their own face, making it equally powerful for anyone trying to stalk someone else. The company did not respond to a request for comment.

 

PimEyes monetizes facial recognition by charging for a premium tier, which allows users to see which websites are hosting images of their faces and gives them the ability to set alerts for when new images are uploaded. The PimEyes premium tiers also allow up to 25 saved alerts, meaning one person could be alerted to newly uploaded images of up to 25 people across the internet. PimEyes has also opened up its service for developers to search its database, with pricing for up to 100 million searches per month.

Facial recognition search sites are rare but not new. In 2016, Russian tech company NtechLab launched FindFace, which offered similar search functionality, until shutting it down in a pivot to state surveillance. Founders described it as a way to find women a person wanted to date.

“You could just upload a photo of a movie star you like, or your ex, and then find 10 girls who look similar to her and send them messages,” cofounder Alexander Kabakov told The Guardian.


While Google’s reverse image search also has some capability to find similar faces, it doesn’t use specific facial recognition technology, the company told OneZero earlier this year.

“Search term suggestions rely on aggregate metadata associated with images on the web that are similar to the same composition, background, and non-biometric attributes of a particular image,” a company spokesperson wrote in February. If you upload a photo of yourself with a blank background, for example, Google may surface similarly composed portraits of other people who look nothing like you.

PimEyes also writes on its website that it has special contracts available for law enforcement that can search “darknet websites,” and its algorithms are also built into at least one other company’s application. PimEyes works with Paliscope, software aimed at law enforcement investigators, to provide facial recognition inside documents and videos. Paliscope says it has recently partnered with 4theOne Foundation, which seeks to find and recover trafficked children.

There are still many open questions about PimEyes, like exactly how it obtains data on people’s faces, its contracts with law enforcement, and the accuracy of its algorithms.

PimEyes markets itself as a solution for customers worried about where their photos appear online. The company suggests contacting websites where images are hosted and asking them to remove images. But because anyone can search for anyone, services like PimEyes may generate more privacy issues than they solve.

 

[Source: This article was published in onezero.medium.com By Dave Gershgorn - Uploaded by the Association Member: Grace Irwin]

Categorized in Search Engine

Privacy-preserving AI techniques could allow researchers to extract insights from sensitive data if cost and complexity barriers can be overcome. But as the concept of privacy-preserving artificial intelligence matures, so do data volumes and complexity. This year, the size of the digital universe could hit 44 zettabytes, according to the World Economic Forum. That sum is 40 times more bytes than the number of stars in the observable universe. And by 2025, IDC projects that number could nearly double.

More Data, More Privacy Problems

While the explosion in data volume, together with declining computation costs, has driven interest in artificial intelligence, a significant portion of data poses potential privacy and cybersecurity questions. Regulatory and cybersecurity issues concerning data abound. AI researchers are constrained by data quality and availability. Databases that would enable them, for instance, to shed light on common diseases or stamp out financial fraud — an estimated $5 trillion global problem — are difficult to obtain. Conversely, innocuous datasets like ImageNet have driven machine learning advances because they are freely available.

 

A traditional strategy to protect sensitive data is to anonymize it, stripping out confidential information. “Most of the privacy regulations have a clause that permits sufficiently anonymizing it instead of deleting data at request,” said Lisa Donchak, associate partner at McKinsey.

But the catch is, the explosion of data makes the task of re-identifying individuals in masked datasets progressively easier. The goal of protecting privacy is getting “harder and harder to solve because there are so many data snippets available,” said Zulfikar Ramzan, chief technology officer at RSA.
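The re-identification risk is easy to demonstrate. The sketch below (hypothetical field values, Python standard library only) shows the common hash-based "anonymization" shortcut and how a dictionary of candidate values reverses an unsalted hash of a low-entropy field:

```python
import hashlib

def pseudonymize(value: str, salt: str = "") -> str:
    """Replace an identifier with a hash -- a common 'anonymization' shortcut."""
    return hashlib.sha256((salt + value).encode()).hexdigest()

# An unsalted hash of a low-entropy field (e.g. an email address) is
# trivially reversed with a dictionary of candidate values:
masked = pseudonymize("alice@example.com")
candidates = ["bob@example.com", "alice@example.com", "eve@example.com"]
recovered = [c for c in candidates if pseudonymize(c) == masked]
# recovered == ["alice@example.com"] -- the "anonymized" record is re-identified

# A secret salt blocks the dictionary attack, but the same salted value can
# still be linked across datasets, which is why stronger privacy-preserving
# techniques are needed as more data snippets become available.
masked_salted = pseudonymize("alice@example.com", salt="long-random-secret")
assert pseudonymize("alice@example.com", salt="long-random-secret") == masked_salted
```

This is an illustration of the failure mode, not a recommended technique; it shows why masked datasets get easier to re-identify as auxiliary data grows.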

The Internet of Things (IoT) complicates the picture. Connected sensors, found in everything from surveillance cameras to industrial plants to fitness trackers, collect troves of sensitive data. With the appropriate privacy protections in place, such data could be a gold mine for AI research. But security and privacy concerns stand in the way.

Addressing such hurdles requires two things. First, a framework providing user controls and rights on the front-end protects data coming into a database. “That includes specifying who has access to my data and for what purpose,” said Casimir Wierzynski, senior director of AI products at Intel. Second, it requires sufficient data protection, including encrypting data while it is at rest or in transit. The latter is arguably a thornier challenge.

[Source: This article was published in urgentcomm.com By Brian Buntz - Uploaded by the Association Member: Bridget Miller]

Categorized in Internet Privacy

New search engine Kilos is rapidly gaining traction on the dark web for its extensive index that allows users access to numerous dark web marketplaces.

A new search engine for the dark web, Kilos, has quickly become a favorite among cybercriminals, and here’s why.

It all began when the dark web search engine, Grams, launched in April 2014. Grams was an instant hit, proving useful not only to researchers but cybercriminals too.

The search engine used custom APIs to scrape some of the most prominent cybercriminal markets at the time, including AlphaBay, Dream Market, and Hansa.

In addition to helping searchers find illicit products using simple search terms, Grams also provided Helix, a Bitcoin mixer service that let users conveniently hide their transactions on the platform.

 

Yes, Grams was a revolutionary tool for cybercriminals on the dark web, but its index was still relatively limited.

In a Wired interview, an administrator stated that the team behind Grams did not yet have the capability to crawl the whole darknet. So they created an automated site submitter that let publishers submit their sites for listing on the search engine.

Despite its success, Grams would not remain for long. In 2017, the administrators shut down the search engine’s indexing ability and took the site down.

However, a new search engine would eventually rise to take Grams’ place two years later.

Kilos Became the Favorite Search Engine on the Dark Web

In November 2019, talks of a new dark web-based search engine called Kilos started making rounds on cybercriminal forums.

According to Digital Shadows, it is uncertain whether Kilos pivoted directly from Grams or whether the same administrator is behind both projects. However, the similarities are uncanny.

For example, both share a similar, search-engine-like aesthetic. The naming convention also remains the same, following units of weight or mass.

As the name suggests, Kilos packs more weight than Grams ever did.

Thanks to the new search engine, searchers can now perform more specific searches against a more extensive index. Kilos enables users to search for vendors and listings across six of the top dark web marketplaces.

These include Cryptonia, Samsara, Versus, CannaHome, Cannazon, and Empire.

According to Digital Shadows, Kilos has already indexed 553,994 forum posts, 68,860 listings, 2,844 vendors, and 248,159 reviews from seven marketplaces and six forums. That’s an unprecedented amount of dark web content.

What’s more, the dark web search engine appears to be improving, with the administrator introducing new updates and features. Some of these features include:

  • Direct communication between administrator and users
  • A new type of CAPTCHA to prevent automation
  • Advanced filtering system
  • Faster searches and a new advertising system
  • New Bitcoin mixer called Krumble

Kilos is gradually becoming the first stop for dark web users. From individuals looking to purchase illicit products to those searching for specific vendors, tons of users now depend on the search engine.

This could further increase the amount of data that’s available to security researchers as well as threat actors.

 

[Source: This article was published in edgy.app By Sumbo Bello - Uploaded by the Association Member: Jennifer Levin]

Categorized in Search Engine

More changes have been announced in the senior leadership of French pro-privacy search engine Qwant.

President and co-founder Eric Leandri will be moving from an operational to a strategic role on January 15, the company said today, while the current deputy managing director for sales and marketing, Jean-Claude Ghinozzi, is being promoted to president.

Leandri will leave the president role on January 15, although he is not departing the business entirely but will instead shift to chair a strategic and scientific committee — where he says he will focus on technology and “strategic vision.”

 

This committee will work with a new governance council, also being announced today, which will be chaired by Antoine Troesch, investment director of Qwant investor Banque des Territories, per the PR.

At the same time, Mozilla veteran Tristan Nitot, who was only promoted to a new CEO role at Qwant in September, is returning to his prior job as VP of advocacy. Leandri told us that Nitot will retain the spokesman component of the CEO job, leaving Ghinozzi to focus on monetization, which he said is Qwant’s top priority now.

“[Nitot] is now executive VP in charge of communications and media,” Leandri told TechCrunch. “He has to take care of company advocacy. Because of my departure he will have now to represent Qwant in [the media]. He will be the voice of Qwant. But that position will give him not enough space and time to be the full-time CEO of the company — doing both is quite impossible. I have done that for years… but it’s very complicated.”

“We will now need to focus a lot on monetization and on our core business… to create a real ad platform,” he added, by way of explaining the latest round of exec restructuring. “This needs to have somebody in charge of doing that monetization process — that execution process of the scale of Qwant.”

Ghinozzi will be responsible for developing a “new phase” for the search engine so it can scale its business in Europe, Leandri also said, adding: “For my part I take on the strategy and the tech, and I’m a member of the board.”

The search engine company is also announcing that it’s closing a new funding round to support infrastructure and scaling — including taking in more financing from existing backers Banque des Territories and publishing giant Axel Springer — saying it expects this to be finalized next month.

Leandri would not provide details on the size of the round today, but French news website Liberation is reporting it as €10 million, citing a government source. (Per other reports in the French media, Qwant has been losing tens of millions of euros per year.)

Qwant’s co-founder did trail some “very good announcements” he said are coming imminently on the user growth front in France, related to new civil companies switching to the search engine. But again, he declined to publicly confirm full details at this stage — saying the news would be confirmed in around a week’s time.

Liberation‘s report points to this being confirmation that the French state will go ahead with making Qwant the default search engine across the administration — giving its product a boost of (likely) millions more regular users, and potentially unlocking access to more government funding.

The move by the French administration aligns with a wider push for digital sovereignty in a bid to avoid being too reliant on foreign tech giants. However, in recent months, doubt had been thrown on the government’s plan to switch wholesale from Google’s  search engine to the homegrown search alternative — after local media raised questions over the quality of Qwant’s search results.

The government has been conducting its own technical audit of Qwant’s search engine. But, per Liberation — which says it obtained an internal government memo earlier this month — the switch will go ahead, and is slated to be completed by the end of April.

Qwant has faced further uncomfortable press scrutiny on its home turf in recent months, with additional reports in French media suggesting the business has been facing a revenue crunch — after its privacy-respecting search engine generated lower than expected revenues last year.

 

On this, Leandri told us Qwant’s issue boils down to a lack of ad inventory, saying it will be Ghinozzi’s job to tackle that by making sure it can monetize more of the current impressions it’s generating — such as by focusing on serving more ads against shopping-related searches, while continuing to preserve its core privacy/non-tracking promise to users.

The business was focused last year on putting in place search engine infrastructure to prepare for scaling user growth in Europe, he suggested — meaning it was spending less time on monetizing user searches.

“We started to refocus on the monetization in November and December,” he said. “So we have lost some months in terms of monetization… Now we have started to accelerate our monetization phase and we need now to make it even better in shopping, for example.”

Leandri claims Qwant has already seen “a very good ramp up,” after turning its attention back to monetization these past two months — but says beefing up ad inventory including by signing up more ad partners and serving its own ads will now be “the focus of the company.”

“For example today on 100 queries we were sometime during the year at 20 ads, just 20% of coverage,” he told us, noting that some “iPhone 11” searches done via Qwant haven’t resulted in any ads being served to users in recent times. “We need to go to 30%-40%… We need to make it better on the shopping queries, bringing new customers. We need to do all these things.

“Right now we have signed with Havas and Publicis in France for Europe but we need to add more partners and start adding our own ads, our own shopping ads, our own technology for ads. That’s the new focus.”

Additionally, there have also been a number of reports in French media that have alleged HR problems within Qwant. Articles — such as this one by Next Inpact — have reported at length on claims by some employees that Leandri’s management style created a toxic workplace culture in which staff were subject to verbal abuse, threats and bullying.

Qwant disputes these reports, but it is notable that the co-founder is stepping back from an operational role at a time when both he and the business face questions over a wave of negative domestic press, and when investors are being asked to plough in fresh financing while a key strategic customer (the French government) scrutinizes the product and the business.

The health of workplace culture at technology companies and high-pressure startups has come in for increasing attention in recent years, as workplace expectations have shifted with the generations and digital technologies have encouraged greater openness and provided outlets for people who feel unfairly treated to make their grievances more widely known.

 

Major scandals in the tech industry in recent years include Uber being publicly accused of having a sexist and bullying workplace culture by a former engineer — and, more recently, travel startup Away, whose CEO stepped down in December after a bombshell report in the press exposing a toxic culture.

[Source: This article was published in techcrunch.com By Natasha Lomas - Uploaded by the Association Member: Logan Hochstetler]

Categorized in Search Engine

With Google dominating the search engine space, you may be wondering why we need another search engine. However, given the well-known privacy issues with Google and a couple of other major search engines like Bing and Yahoo, people are beginning to look for alternatives, especially ones that support causes they believe in.

Ekoru.org is a new search engine that aims to help save our oceans by addressing two key problems: plastic pollution and CO2 levels. The promise is simple: 60% of revenue goes to partners involved in ocean cleanup and ocean reforestation. That’s right, you read correctly. Reforestation. In the ocean.

The health of our oceans is at a tipping point, with unprecedented plastic pollution and damage to marine life and ecosystems. Covering 71% of our planet and containing 96% of all of our water, the oceans are intrinsically linked to our future.

 

This search engine for the oceans was started by Australian expatriate Ati Bakush and Alison Lee, a husband-and-wife team living in Malaysia. Bakush has 20 years of experience developing software for mobile operator networks and internet service providers, and Lee is a former country marketing manager for Nike. Their combined expertise in technology and marketing, along with their love for the environment, resulted in the Ekoru.org search engine.

Like any search engine, as users submit queries, related sponsored links may appear. If a user clicks on a sponsored link, the website makes money, which is then shared with its non-profit partners. Ekoru.org is partnering with Big Blue Ocean Cleanup to remove plastic and Operation Posidonia to reforest our oceans.

Operation Posidonia, led by the University of New South Wales in Australia, is working on reforesting the ocean by replanting seagrass. These hidden meadows of green under our oceans can trap carbon up to 40 times faster than tropical rainforests while also producing oxygen. They are the unsung heroes in the fight against climate change: a blue carbon sink that is actually green!

Big Blue Ocean Cleanup is a non-profit with volunteer teams around the world that clean waters and coastlines of waste and plastic. When a whale dies of starvation with a belly full of plastic, we lose an important ally in the fight against climate change.

Whales are vital to the growth of phytoplankton, which thrive when whales relieve themselves and release a “poonami” in the ocean. Phytoplankton absorb as much carbon as four Amazon forests and produce 50% of the oxygen in our atmosphere.

Ekoru’s commitment to the environment also extends to infrastructure, with servers powered entirely by hydroelectricity. Each server is water-cooled, eliminating the need for onboard fans, and natural airflow in the building means no power-hungry air conditioning is required. Every search is as environmentally friendly as possible.

A strict privacy policy ensures that users concerned about their privacy have peace of mind. Every search is encrypted and private, and no data is stored on servers about user search activity.

Ekoru.org is already available as an option in some desktop and mobile browsers, such as Pale Moon and Monument. An easily installed browser extension for Chrome, Edge, Firefox, and Brave desktop makes it your default search engine. An Android application is available, with an iOS version coming soon.

 

Since launch, Ekoru.org has received a fantastic response from users around the world. Ocean lovers now have the opportunity to help their oceans through the simple act of searching. Every Ekoru search leaves a minimal carbon footprint while helping to clean and reforest our oceans. Give it a try and change your search engine to help save our oceans.

[Source: This article was published in techstartups.com - Uploaded by the Association Member: Jeremy Frink]

Categorized in Search Engine

Private.sh is a new private search engine that uses cryptography to ensure that your search history cannot be tracked by anyone – even us. Private.sh comes from the privacy-committed makers of Private Internet Access, in partnership with GigaBlast – one of the few companies with their own searchable index of the internet.

This first truly private search engine is possible thanks to that partnership, with each partner providing a key piece of the puzzle to deliver a new standard of cryptographically secured privacy for search engines.

Chances are, your current search engine is not private

One of the core functions, and business models, of a search engine is tracking searches and who makes them. The vast majority of search engines take personally identifiable information, like your IP address and browser fingerprint, to create or add to a profile. Besides keeping a log of what you’ve searched for, non-private search engines also log which result you clicked on, how long it took, whether you were feeling lucky, and so on.

 

With non-private search engines, being able to identify you – the searcher – and tie your search terms to your user profile while targeting advertising at you is all an essential part of the business model.

That is to say, with normal search engines, your search’s privacy is ignored and trampled on by design. With Private.sh, your privacy is protected by design.

With Private.sh, your search privacy is protected with both encryption and anonymity.

When you enter a search term into Private.sh, the search term gets encrypted on the client side (on your computer or device) using GigaBlast’s public key, which only GigaBlast can decrypt. In effect, this ensures that Private.sh never sees the search term.

After the search term is encrypted, it is passed to the GigaBlast search engine through a Private.sh proxy, so GigaBlast doesn’t see your IP address, browser fingerprint, or anything else that would allow your privacy to be broken or a user profile to be created. This means that neither Private.sh nor GigaBlast is able to build a user profile on you or store your search history.

Finally, the search results are encrypted by GigaBlast using your temporary public key and returned to you through the Private.sh proxy. The results are then decrypted and rendered locally on your device using JavaScript, with a temporary private key that exists only on your device. This client-side keypair is changed for every search request.
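The per-search flow described above can be sketched with a toy asymmetric scheme. This is an illustrative simulation only, not Private.sh’s actual implementation: real deployments use vetted cryptographic libraries, large keys, and padding, none of which this textbook-RSA toy has.

```python
# Toy illustration of the per-search flow: query encrypted for the engine,
# results encrypted for an ephemeral client keypair. NOT secure -- tiny
# primes and no padding; for demonstration of the data flow only.

def make_keypair(p, q, e=17):
    n = p * q
    phi = (p - 1) * (q - 1)
    d = pow(e, -1, phi)          # modular inverse (Python 3.8+)
    return (e, n), (d, n)        # (public key, private key)

def encrypt(msg_int, pub):
    e, n = pub
    return pow(msg_int, e, n)

def decrypt(cipher, priv):
    d, n = priv
    return pow(cipher, d, n)

# 1. Client encrypts the search term with the engine's public key,
#    so the proxy in the middle never sees the plaintext query.
engine_pub, engine_priv = make_keypair(61, 53)
term = 42                                  # stand-in for an encoded search term
query_cipher = encrypt(term, engine_pub)

# 2. Engine decrypts the query, runs the search, then encrypts the results
#    with a temporary public key the client generated for this request only.
assert decrypt(query_cipher, engine_priv) == term
client_pub, client_priv = make_keypair(89, 97)   # ephemeral, per search
result_cipher = encrypt(7, client_pub)           # stand-in for results

# 3. Client decrypts locally; the ephemeral private key never leaves the
#    device, and a fresh keypair is used for the next search.
assert decrypt(result_cipher, client_priv) == 7
```

The design point the sketch captures is separation of knowledge: the proxy sees who you are but not what you searched, while the engine sees the query but not who sent it.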

With this multi-pronged approach, Private.sh is the perfect option when you need to search something privately.

Private Search is finally here, and improving

Another benefit is that since Private.sh and GigaBlast are both unable to build a user profile on you, you’ll be able to get unbiased and private search results every time. You may notice that these results aren’t as accurate or algorithmically tailored to you based on your search history. This is by design and makes Private.sh a perfect complement to your favorite search engine, for when you want to make a search that is truly private.

We will constantly be working to better your search experience without compromising your security. In the future, Private.sh will be working with GigaBlast to expand their index of the internet to bring more results. Private.sh is determined to be your private search engine of choice.

Try private search out and add it to your privacy toolkit today.

 

 [Source: This article was published in privateinternetaccess.com - Uploaded by the Association Member: Joshua Simon]

Categorized in Search Engine
The Association of Internet Research Specialists is the world's leading community for Internet Research Specialists, providing a unified platform that delivers education, training, and certification for online research.
