
Is there a troll’s voice inside all of us at one time or another?

In today’s online climate, we watch friends unfollow or even unfriend each other, while people boldly insult one another with offensive opinions, thoughts and comments.

Research from Stanford and Cornell University suggests that under the right circumstances, we all have a troll lingering in us.

Lindsey Blackwell, a researcher of online harassment at the University of Michigan, concurs. She points out that technology has the ability to amplify our behavior: not only does our best shine through online, but so does our worst.

Attorney Mitch Jackson, a social media leader and influencer, has experienced his share of trolls. He places them in two categories:

· Recreational trolls - Those who are simply annoying and will eventually go away after you block and ignore them.

· Criminal trolls - The ones you need to take more seriously; they are out to seriously harm you and/or your business.

When I wrote Google Bomb with the late John Dozier, a leading internet attorney, he described trolls as ten scofflaw personas:

1. Pick-pocket

This is the guy who used to wait on street corners for elderly ladies to pass. He enjoys attacking defenseless people and stealing covertly using deception.

2. Wacko

We usually identify a wacko situation quickly. There are distinctive characteristics of his communications. The wacko is usually a “follower,” someone looking to gain attention and recognition, but escalates what may have started as fair criticism into more and more outrageous claims.

3. Druggie

Or, maybe “liquid courage” would be more appropriate. This guy is exactly what comes to mind. During the day this blogger is a normal guy, but at night he returns to the sanctity of his home, gets drunk or high, and goes out on the web looking for “hook-ups” and blogging on his “hang-ups.”

4. Alien

No, not from another world. But from overseas. In a far, far away place, without any treaty with the US, in a country without an effective legal system and no notion of business or personal property ownership rights.

5. Nerd

This is the guy who is scared to talk with a girl, but behind the keyboard, all alone, morphs into a Casanova. This empowerment of anonymity creates an omnipotent persona, and for the first time the nerd feels the effect of power and control, gets an adrenaline buzz when he exercises it, and he exercises it often, usually creating or perpetuating a volatile situation in which he feels he can outsmart the “opposition.”

6. Rookie

Enjoy debating a thirteen-year-old? They are out on the net acting like adults, posting statements and play-acting like grown-ups.

7. Sadist

This person attacks others, causes pain, and revels in the results in ways not worthy of mention. He loves to create, direct, control, and unleash a firestorm of criticism about you or your company just to create pain and damage.

8. Bankrupt

No, not morally bankrupt. Actually bankrupt…no money, no assets, no prospects for work, and nothing to lose.

9. Criminal

Career criminals, no less. Like the convicted felon running a sophisticated extortion scheme against a very prominent business.

10. Mis-Leader

This person is in no manner a leader. This blogger has a hidden agenda, but he just makes it sound like he is a totally objective commentator.

Invasion of Trolls

Over the past few years we have watched platforms such as the Huffington Post put an end to anonymous comments, while NPR, Popular Science and Motherboard shut their comment sections down completely because online commenters were harassing each other and abusing the privilege of leaving comments.

These were not children polluting their sites. The people leaving comments on these platforms were likely adults, yet they were acting like toddlers with a keypad, poking, teasing and harassing their playmates on the playground with no concern that those playmates are human too.

Where has civility gone when pissing off a fellow parent in the carpool line gets you trashed and trolled on social media? Want to break up with your partner, but fear you’ll end up as a victim of e-venge? Could it be that when people are pushed to their limit they are recognizing the power of the keyboard?

Yes.

Justin Cheng, a researcher at Stanford University and lead author of the study above, wanted to better understand why trolling is so prevalent today.

“While the common knowledge is that trolls are particularly sociopathic individuals that occasionally appear in conversations, is it really just these people who are trolling others?” 


We may think we know what trolls look like and even who they are, usually the cesspool of the web, yet the research uncovers that you or I could easily be pushed to the point of digital warfare. Be in a bad mood, wake up on the wrong side of the bed, or feel passionate about a heated trending topic, and your fingers may go flying. And when a thread has already sparked a fire of negative comments, some see it as a green light to pile on the insults: why not add my two cents, everyone else is.

This gang-like trolling behavior is a “spiral of negativity”, explains Jure Leskovec, senior author of the above study.

“Just one person waking up cranky can create a spark and, because of discussion context and voting, these sparks can spiral out into cascades of bad behavior. Bad conversations lead to bad conversations. People who get down-voted come back more, comment more and comment even worse.”


There is an art to commenting when you don’t agree with posters. We should learn to be constructive, not combative, in our responses.

It goes back to the old cliché many of us were taught by our grandparents and parents: “If you don’t have anything nice to say, don’t say it at all.” We need to take this advice online.

Takeaway tips:

  • When in doubt, click out.
  • Pause before posting.
  • Recognize we are all a click away from being a troll.

Source: This article was published on huffingtonpost.com by Sue Scheff


Online abuse has been a problem ever since the Internet was created. But over the past few years, it seems to have escalated—despite the efforts of platforms like Twitter and Facebook to try and control it.

And some experts believe it could get worse before it gets better.

A new report from the Pew Research Center asked more than 1,500 technologists and academics about this kind of online behavior. More than 80% of them replied that they expect public discourse online will either stay the same or get worse over the next decade.

The question asked by the researchers: "In the next decade, will public discourse online become more or less shaped by bad actors, harassment, trolls, and an overall tone of griping, distrust, and disgust?" Over 40% replied they don't expect this situation to change much, and another 39% said they could see it actually becoming more of a problem rather than less.

"Trolling will continue, while social platforms, security experts, ethicists, and others will wrangle over the best ways to balance security and privacy, freedom of speech, and user protections," Susan Etlinger, a technology analyst at the Altimeter Group, told Pew researchers.

Although certain online "safe spaces" may be developed that will be free of trolls and harassment, some of the experts surveyed said that these will be little more than Potemkin villages—that is, attractive facades that hide the true nature of the social web.

In some cases, the research report warns, an attempt to control abuse and harassment could actually result in an infringement of personal freedoms, including freedom of speech, and could lead to the web becoming less open and more polarized.

"One of the biggest challenges will be finding an appropriate balance between protecting anonymity and enforcing consequences for the abusive behavior that has been allowed to characterize online discussions for far too long," said Bailey Poland, author of the recent book Haters: Harassment, Abuse, and Violence Online.

This is a problem that Twitter in particular has struggled with for much of its life. The company's senior executives often stressed that the service was "the free-speech wing of the free-speech party," and that users should be free to say whatever they wished anonymously; some believe that stance hampered its ability to address abuse on the platform.

Some of those who responded said they expect that better reputation-management systems and moderation tools may help to solve the problem, but others said they fear that these kinds of tools will remove anonymity and make surveillance (including government monitoring) easier.

There was a certain fatalism underlying many of the responses that expected the situation to remain unchanged, the Pew Center found.

"Social media will continue to generate increasingly contentious, angry, mob-like behavior," said Paul Edwards, a professor of information and history at the University of Michigan. "The phenomenon that underlies this behavior has been consistently observed since the early days of email, so there is no reason to think that some new technique or technology will change that."

An anonymous respondent told the Pew researchers that "human nature has not much changed over the past 2,000 years; I don’t expect much change over the next 10."

In a recent essay, Microsoft sociologist danah boyd (who spells her name using only lowercase letters) said much the same thing about the problem of "fake news" or false information online. Although many people wish that Facebook and Google could fix the problem, she explained, this is impossible because it is a social and cultural problem.

"No amount of 'fixing' Facebook or Google will address the underlying factors shaping the culture and information wars in which America is currently enmeshed," boyd says.

A number of respondents to the Pew study also noted that there is an economic incentive for social platforms and websites to encourage polarizing content, including fake news, because it drives engagement and thereby boosts revenue, which is dependent on advertising. "Technology companies have little incentive to rein in uncivil discourse," the report says.

Fake news and similar problems are also being fueled by the fact that governments and other political forces have found they can manipulate people's behavior, argues Laurent Schüpbach, a neuropsychologist at University Hospital in Zurich.

"The reason it will probably get worse is that companies and governments are starting to realise that they can influence people’s opinions that way," he told the Pew researchers. "And these entities sure know how to circumvent any protection in place. Russian troll armies are a good example of something that will become more and more common in the future."

Source: fortune.com


A look back at a year in which the internet produced an American president and fake news, much of it driven by people who used to exist on the digital margins.

’Tis the season for summing up 2016, which millions worldwide consider one of the worst years in decades. It’s no coincidence that John Oliver devoted an especially long farewell video to the year in “Last Week Tonight” and, right up until the last minute on Saturday, everyone was biting their nails over which other beloved celebrity the year might take from us.

For liberals, the year’s most shocking event was Donald Trump’s election victory. This after a presidential campaign that seemed to break all the rules of American politics (written and unwritten), and was also the most internet-based in history.

If the latter statement sounds hackneyed, it’s because you’ve heard it ad nauseam over the last decade: Every election has brought a spate of articles proving that same fact, typically replete with statistics about the candidates’ extensive online activity. And the first web-based campaign ads drew cries of wonder from journalists throughout the world.

But that isn’t the story of 2016. Last year’s story is about an internet that we thought had been pushed to the margins.

Facebook, Instagram, Google and a plethora of other applications have organized our world into comfortable, well-padded boxes. They make sure the design won’t strain your eyes but, above all, they conceal entire worlds via hidden algorithms. About 95 percent of the time, these worlds never even come near your field of view, to the point that you effectively forget they were there.

Nowadays, the giants of Silicon Valley seem like omnipotent entities that know everything about us – including things we’d never think of by ourselves – and have the ability to monitor our every action (almost). In Israel in recent months, we’ve seen how Facebook does everything it can to police the conversation and make it more pleasant and comfortable by censoring certain words or implementing a hard-line blocking policy – even as it tries to leave responsibility for many issues to third parties.

Most of the time, we feel that the internet we experience via applications and major websites has become a relatively orderly and controlled place, one where nobody is truly anonymous or free.

But Trump’s victory proved that even Facebook, Google and Twitter have limitations and that, contrary to what many had grown accustomed to thinking, the internet doesn’t begin and end with them. Admittedly, Trump was borne aloft on waves of loathing for the U.S. government, fake news from home and abroad, and almost overt Russian support. But his victory was equally a victory of the free, unbridled internet – for better or worse.


Of course, it began with him: the troll-in-chief, who for years has used his Twitter account to attack and harass his opponents. But Trump wasn’t alone. An army of trolls rose around him, and not just on Facebook and Twitter. The very loosely organized white nationalist movement known as the alt-right, which became so identified with Trump’s campaign, arose on extreme right-wing websites like Breitbart and political groups on Reddit and sites like 4Chan and 8Chan.

Some of these trolls were people we first became aware of through their antifeminist attacks in online gaming (aka Gamergate). The young people in this group knew exactly how to exploit Facebook and Twitter for their own purposes, whether by disseminating racist messages that went viral or by smear campaigns against opponents and critics – like the female student who criticized Trump and became the subject of a hate campaign and rape threats. The voices of these supporters were also amplified by hundreds of thousands of fictitious bots, especially on Twitter, which helped disseminate their messages.

In this regard, Russia’s modus operandi also proved the ability of a relatively weak player (if you look at the technology gap) to use tools like hackers, fake news and paid battalions of trolls to influence countries that are far richer and more developed.

All these turned the U.S. presidential election into the most internet-dominated in history: Not because we’re online more than ever before (though we are), but because just when we thought we knew the web, it once again surprised us.


Groups that existed on its farthest fringes only yesterday helped to put Trump in the White House. Their troll revolution emerged from remote corners of the web and then grew by exploiting the organized, regulated social media that we still seem to think controls the world.

Source: http://www.haaretz.com/life/.premium-1.762500


Social media has sharpened humans' age-old appetite for public shaming, providing a stage and unlimited seating for a seemingly unending stream of immorality plays. Those who share even the simplest identifying details about themselves are vulnerable to being pushed into the glare of the spotlight.

The anonymity the Internet provides frees many individuals of the consequences they might face offline for being abusive to other people. Perhaps appearing to their friends, family and connections as ordinary people in the real world, these Jekyll-and-Hyde netizens transform into trolls to carry out their online assaults.

Anonymity has been a hot-button issue for just about the entire life of the Internet, and although there is no 100 percent solution in sight, the situation is not entirely hopeless, according to Charles King, principal analyst at Pund-IT.

"So long as public sites enable user anonymity, pathological behavior will continue, because it thrives in the shadows," he told TechNewsWorld. "Forcing abusers into the sunlight may be difficult or impossible -- but changes in rules, laws and enforcement practices could make their lives more complicated and less comfortable."

Deep Dive Into Dirt

We know what the problem looks like, thanks to big data and analytics.

A recent analysis identified more than 17,000 tweets related to body shaming, for example, and ranked the most common terms Twitter users lobbed at others to shame them for their weight.

Artificial intelligence soon might be able to catch and moderate cruel posts mere moments after publication, suggested a University of Lisbon team of researchers who have leveraged machine learning to teach AI to suss out sarcasm.
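As a rough illustration only, the heart of that kind of automated moderation is a text classifier that scores each new comment and holds likely abuse for human review. The sketch below is not the Lisbon team's model; it assumes a generic scikit-learn pipeline, and the tiny training set and 0.5 threshold are invented for illustration.

```python
# Hypothetical sketch: a toy abusive-comment classifier, NOT the Lisbon
# researchers' system. Real moderation models need large labeled datasets.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny invented training set: 0 = acceptable, 1 = abusive.
comments = [
    "you are a wonderful person",
    "great point, thanks for sharing",
    "nobody wants you here, just leave",
    "you are worthless and everyone hates you",
]
labels = [0, 0, 1, 1]

# Bag-of-words features (unigrams and bigrams) feeding a logistic regression.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(comments, labels)

# Score a new comment moments after it is posted; hold it for review
# if the predicted probability of abuse crosses an (invented) threshold.
new_comment = "you should just disappear"
prob_abusive = model.predict_proba([new_comment])[0][1]
if prob_abusive > 0.5:
    print("held for moderation:", new_comment)
else:
    print("published:", new_comment)
```

In practice the hard part is the training data and the threshold, not the pipeline itself: sarcasm, coded insults and context are exactly the cases a toy model like this gets wrong, which is why the sarcasm research matters.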

For now, the moderation and reporting tools available aren't set up to prevent or discourage online abuse, said Rob Enderle, principal analyst at the Enderle Group.

"Reputation protection services can be used, but that doesn't scale well -- they target one person at a time -- and it can be really expensive if you have to litigate and your attacker has no money," he told TechNewsWorld.

What to Do?

It appears Reddit currently has the best system in place, in Enderle's view, as its shadow-blocking tools shield users from whomever they wish to block, while allowing offenders to keep their accounts. Offenders are none the wiser, barring some detective work.
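To make that mechanism concrete, here is a minimal, hypothetical sketch of shadow-block style filtering; the data structures and names are invented for illustration and are not Reddit's actual implementation.

```python
# Hypothetical sketch of shadow-block filtering: blocked users keep posting
# normally, but the blocker's view of the thread silently omits them.
from dataclasses import dataclass

@dataclass
class Comment:
    author: str
    text: str

# Each viewer keeps a private block list; the blocked account is never notified.
block_lists = {
    "alice": {"troll42"},
}

def visible_comments(viewer, thread):
    """Return the thread as this viewer sees it, dropping comments from
    anyone the viewer has shadow-blocked."""
    blocked = block_lists.get(viewer, set())
    return [c for c in thread if c.author not in blocked]

thread = [
    Comment("bob", "Nice write-up."),
    Comment("troll42", "You are an idiot."),
    Comment("carol", "Agreed, thanks for posting."),
]

# Alice's feed omits troll42's comment, while troll42's own view of the
# thread is unchanged, so there is no obvious signal that a block happened.
for c in visible_comments("alice", thread):
    print(f"{c.author}: {c.text}")
```

The point the article makes is captured in that last property: because the offender's account and posts remain visible to the offender, detecting the block takes deliberate detective work.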

"Of course, publicizing shamers so they lose their jobs, gym memberships, and get attacked themselves does work," he acknowledged, "and if it is done enough, that should change behavior."

However, that approach so far hasn't been used enough to make a difference, Enderle said.

That could change if social media sites and other forums were willing to make some changes.

They could take proactive steps that might make a difference, noted King, who pointed to a list of suggestions for Twitter, posted online by Randi Lee Harper, founder of the Online Abuse Prevention Initiative.

Those changes might result in a significant decrease in the prevalence of abuse on Twitter, but what will it take to inspire websites and their parent companies to intercede?

"Many, if not most, technology vendors bend over backward to avoid favoritism and maintain level playing fields for users of all stripes," King pointed out. "I respect that attitude, but it's often subject to being gamed by some users -- and in some circumstances has resulted in online environments that amplify abusive behavior."

Societal Shift

Machine learning tools one day might be capable of rejecting abusive comments before their intended targets ever see them. However, even if the companies running social networks work strenuously to stomp out online abuse, it's ultimately up to humans to ensure that humanity prevails.

The best line of defense against social shaming starts at home, suggested counselor Scott A. Spackey.

"Family validation and bonding, and personal achievement with sports, school work and personal goals is the antidote to ANY source of social shaming," he told TechNewsWorld.

People are more immune to criticism from outsiders when they have evidence to the contrary, provided by self knowledge and by those in their inner circles, Spackey said. For example, it's easier to brush off being called "stupid" when one's grades indicate otherwise.

"We all need to remember there's no law against unfriending a social network contact at any time," he noted. "Virtual life has same rules as non-virtual life: You get to have the final say on who you interact with and what you are exposed to."

While it's ideal to teach those lessons in the home, it's never too late to improve oneself with education and re-education.

Pity the Fool?

When Playboy Playmate Dani Mathers snapchatted an image of an older woman nude in a locker room, that was an opportunity for education, according to relationship and etiquette expert April Masini.

"It was a moment to talk about what happens, naturally, to our bodies," she told TechNewsWorld.

"There is a lesson for Ms. Mathers to learn that bodies age and they don't look the same at 20 as they do at 60 or 70 or 80, and that it's important to celebrate the changes of a healthy and aging human being," Masini said, "instead of mocking the change that is often difficult to endure because it's a signal life is slipping away -- as it should."

Mathers undoubtedly was "afraid of what she saw" to some degree, she suggested, and might not even be conscious of the aging of her own body.

"The impetus for body shamers and bullies is usually fear," Masini said. "We see bravado and mean-spirited posts -- we don't acknowledge the fear behind the person posting."

Source: TechNewsWorld

The internet is undoubtedly one of the most important technical advances ever, but it's not always a very nice place.

It's the home of trolls and haters, a place where famous people and ordinary people alike are often subject to shocking threats, insults, and having their personal information published online (a practice known as doxing).

It's a thorny problem for social-media sites like Twitter, which would rather protect free speech than police it, and Facebook, where like-minded people can gather and affirm each other.

But Microsoft is taking a harder stance against hate speech. On Friday it said it wants to make it easier for people to report online abuse in its consumer communities, which include everything from Skype, OneDrive, and Outlook to the gaming community Xbox Live.

"For many years we’ve sought to protect our customers by prohibiting hate speech and removing such content from our hosted consumer services. While neither our principles nor our policies are changing, we are refining some of our processes to make it easier for customers to report hate speech," explains Microsoft's chief online-safety officer Jacqueline Beauchere in a blog post.

To that end, Microsoft introduced a new form that makes it easier to report hate speech, along with a clear definition of what constitutes it. Anything that advocates violence or promotes hatred based on age, disability, gender, national or ethnic origin or race, religion, or sexual orientation/gender identity is the kind of thing Microsoft will nix.

The new form also makes it easier for people to log a protest if their sites or posts were found to be in violation and blocked or removed.

Microsoft also recently joined other online firms to support the European Commission Code of Conduct countering illegal hate speech online, Beauchere says.

Source: http://www.businessinsider.com/microsoft-tries-to-silence-internet-haters-2016-8

 


They’re turning the web into a cesspool of aggression and violence. What watching them is doing to the rest of us may be even worse

This story is not a good idea. Not for society and certainly not for me. Because what trolls feed on is attention. And this little bit–these several thousand words–is like leaving bears a pan of baklava.

It would be smarter to be cautious, because the Internet’s personality has changed. Once it was a geek with lofty ideals about the free flow of information. Now, if you need help improving your upload speeds the web is eager to help with technical details, but if you tell it you’re struggling with depression it will try to goad you into killing yourself. Psychologists call this the online disinhibition effect, in which factors like anonymity, invisibility, a lack of authority and not communicating in real time strip away the mores society spent millennia building. And it’s seeping from our smartphones into every aspect of our lives.

The people who relish this online freedom are called trolls, a term that originally came from a fishing method online thieves use to find victims. It quickly morphed to refer to the monsters who hide in darkness and threaten people. Internet trolls have a manifesto of sorts, which states they are doing it for the “lulz,” or laughs. What trolls do for the lulz ranges from clever pranks to harassment to violent threats. There’s also doxxing–publishing personal data, such as Social Security numbers and bank accounts–and swatting, calling in an emergency to a victim’s house so the SWAT team busts in. When victims do not experience lulz, trolls tell them they have no sense of humor. Trolls are turning social media and comment boards into a giant locker room in a teen movie, with towel-snapping racial epithets and misogyny.

They’ve been steadily upping their game. In 2011, trolls descended on Facebook memorial pages of recently deceased users to mock their deaths. In 2012, after feminist Anita Sarkeesian started a Kickstarter campaign to fund a series of YouTube videos chronicling misogyny in video games, she received bomb threats at speaking engagements, doxxing threats, rape threats and an unwanted starring role in a video game called Beat Up Anita Sarkeesian. In June of this year, Jonathan Weisman, the deputy Washington editor of the New York Times, quit Twitter, on which he had nearly 35,000 followers, after a barrage of anti-Semitic messages. At the end of July, feminist writer Jessica Valenti said she was leaving social media after receiving a rape threat against her daughter, who is 5 years old.

A Pew Research Center survey published two years ago found that 70% of 18-to-24-year-olds who use the Internet had experienced harassment, and 26% of women that age said they’d been stalked online. This is exactly what trolls want. A 2014 study published in the psychology journal Personality and Individual Differences found that the approximately 5% of Internet users who self-identified as trolls scored extremely high in the dark tetrad of personality traits: narcissism, psychopathy, Machiavellianism and, especially, sadism.

But maybe that’s just people who call themselves trolls. And maybe they do only a small percentage of the actual trolling. “Trolls are portrayed as aberrational and antithetical to how normal people converse with each other. And that could not be further from the truth,” says Whitney Phillips, a literature professor at Mercer University and the author of This Is Why We Can’t Have Nice Things: Mapping the Relationship Between Online Trolling and Mainstream Culture. “These are mostly normal people who do things that seem fun at the time that have huge implications. You want to say this is the bad guys, but it’s a problem of us.”

A lot of people enjoy the kind of trolling that illuminates the gullibility of the powerful and their willingness to respond. One of the best is Congressman Steve Smith, a Tea Party Republican representing Georgia’s 15th District, which doesn’t exist. For nearly three years Smith has spewed over-the-top conservative blather on Twitter, luring Senator Claire McCaskill, Christiane Amanpour and Rosie O’Donnell into arguments. Surprisingly, the guy behind the GOP-mocking prank, Jeffrey Marty, isn’t a liberal but a Donald Trump supporter angry at the Republican elite, furious at Hillary Clinton and unhappy with Black Lives Matter. A 40-year-old dad and lawyer who lives outside Tampa, he says he has become addicted to the attention. “I was totally ruined when I started this. My ex-wife and I had just separated. She decided to start a new, more exciting life without me,” he says. Then his best friend, who he used to do pranks with as a kid, killed himself. Now he’s got an illness that’s keeping him home.

Marty says his trolling has been empowering. “Let’s say I wrote a letter to the New York Times saying I didn’t like your article about Trump. They throw it in the shredder. On Twitter I communicate directly with the writers. It’s a breakdown of all the institutions,” he says. “I really do think this stuff matters in the election. I have 1.5 million views of my tweets every 28 days. It’s a much bigger audience than I would have gotten if I called people up and said, ‘Did you ever consider Trump for President?'”

Trolling is, overtly, a political fight. Liberals do indeed troll–sex-advice columnist Dan Savage used his followers to make Googling former Pennsylvania Senator Rick Santorum’s last name a blunt lesson in the hygienic challenges of anal sex; the hunter who killed Cecil the lion got it really bad.

But trolling has become the main tool of the alt-right, an Internet-grown reactionary movement that works for men’s rights and against immigration and may have used the computer from Weird Science to fabricate Donald Trump. Not only does Trump share their attitudes, but he’s got mad trolling skills: he doxxed Republican primary opponent Senator Lindsey Graham by giving out his cell-phone number on TV and indirectly got his Twitter followers to attack GOP political strategist Cheri Jacobus so severely that her lawyers sent him a cease-and-desist order.

The alt-right’s favorite insult is to call men who don’t hate feminism “cucks,” as in “cuckold.” Republicans who don’t like Trump are “cuckservatives.” Men who don’t see how feminists are secretly controlling them haven’t “taken the red pill,” a reference to the truth-revealing drug in The Matrix. They derisively call their adversaries “social-justice warriors” and believe that liberal interest groups purposely exploit their weakness to gain pity, which allows them to control the levers of power. Trolling is the alt-right’s version of political activism, and its ranks view any attempt to take it away as a denial of democracy.

In this new culture war, the battle isn’t just over homosexuality, abortion, rap lyrics, drugs or how to greet people at Christmastime. It’s expanded to anything and everything: video games, clothing ads, even remaking a mediocre comedy from the 1980s. In July, trolls who had long been furious that the 2016 reboot of Ghostbusters starred four women instead of men harassed the film’s black co-star Leslie Jones so badly on Twitter with racist and sexist threats–including a widely copied photo of her at the film’s premiere that someone splattered semen on–that she considered quitting the service. “I was in my apartment by myself, and I felt trapped,” Jones says. “When you’re reading all these gay and racial slurs, it was like, I can’t fight y’all. I didn’t know what to do. Do you call the police? Then they got my email, and they started sending me threats that they were going to cut off my head and stuff they do to ‘N words.’ It’s not done to express an opinion, it’s done to scare you.”

Because of Jones’ harassment, alt-right leader Milo Yiannopoulos was permanently banned from Twitter. (He is also an editor at Breitbart News, the conservative website whose executive chairman, Stephen Bannon, was hired Aug. 17 to run the Trump campaign.) The service said Yiannopoulos, a critic of the new Ghostbusters who called Jones a “black dude” in a tweet, marshaled many of his more than 300,000 followers to harass her. He not only denies this but says being responsible for your fans is a ridiculous standard. He also thinks Jones is faking hurt for political purposes. “She is one of the stars of a Hollywood blockbuster,” he says. “It takes a certain personality to get there. It’s a politically aware, highly intelligent star using this to get ahead. I think it’s very sad that feminism has turned very successful women into professional victims.”

A gay, 31-year-old Brit with frosted hair, Yiannopoulos has been speaking at college campuses on his Dangerous Faggot tour. He says trolling is a direct response to being told by the left what not to say and what kinds of video games not to play. “Human nature has a need for mischief. We want to thumb our nose at authority and be individuals,” he says. “Trump might not win this election. I might not turn into the media figure I want to. But the space we’re making for others to be bolder in their speech is some of the most important work being done today. The trolls are the only people telling the truth.”

The alt-right was galvanized by Gamergate, a 2014 controversy in which trolls tried to drive critics of misogyny in video games away from their virtual man cave. “In the mid-2000s, Internet culture felt very separate from pop culture,” says Katie Notopoulos, who reports on the web as an editor at BuzzFeed and co-host of the Internet Explorer podcast. “This small group of people are trying to stand their ground that the Internet is dark and scary, and they’re trying to scare people off. There’s such a culture of viciously making fun of each other on their message boards that they have this very thick skin. They’re all trained up.”

Andrew Auernheimer, who calls himself Weev online, is probably the biggest troll in history. He served just over a year in prison for identity fraud and conspiracy. When he was released in 2014, he left the U.S., mostly bouncing around Eastern Europe and the Middle East. Since then he has worked to post anti–Planned Parenthood videos and flooded thousands of university printers in America with instructions to print swastikas–a symbol tattooed on his chest. When I asked if I could fly out and interview him, he agreed, though he warned that he “might not be coming ashore for a while, but we can probably pass close enough to land to have you meet us somewhere in the Adriatic or Ionian.” His email signature: “Eternally your servant in the escalation of entropy and eschaton.”

While we planned my trip to “a pretty remote location,” he told me that he no longer does interviews for free and that his rate was two bitcoins (about $1,100) per hour. That’s when one of us started trolling the other, though I’m not sure which:

From: Joel Stein

To: Andrew Auernheimer

I totally understand your position. But TIME, and all the major media outlets, won’t pay people who we interview. There’s a bunch of reasons for that, but I’m sure you know them.

Thanks anyway,

Joel


From: Andrew Auernheimer

To: Joel Stein

I find it hilarious that after your people have stolen years of my life at gunpoint and bulldozed my home, you still expect me to work for free in your interests.

You people belong in a f-cking oven.


From: Joel Stein

To: Andrew Auernheimer

For a guy who doesn’t want to be interviewed for free, you’re giving me a lot of good quotes!


In a later blog post about our emails, Weev clarified that TIME is “trying to destroy white civilization” and that we should “open up your Jew wallets and dump out some of the f-cking geld you’ve stolen from us goys, because what other incentive could I possibly have to work with your poisonous publication?” I found it comforting that the rate for a neo-Nazi to compromise his ideology is just two bitcoins.

Expressing socially unacceptable views like Weev’s is becoming more socially acceptable. Sure, just like there are tiny, weird bookstores where you can buy neo-Nazi pamphlets, there are also tiny, weird white-supremacist sites on the web. But some of the contributors on those sites now go to places like 8chan or 4chan, which have a more diverse crowd of meme creators, gamers, anime lovers and porn enthusiasts. Once accepted there, they move on to Reddit, the ninth most visited site in the U.S., on which users can post links to online articles and comment on them anonymously. Reddit believes in unalloyed free speech; the site only eliminated the comment boards “jailbait,” “creepshots” and “beatingwomen” for legal reasons.

But last summer, Reddit banned five more discussion groups for being distasteful. The one with the largest user base, more than 150,000 subscribers, was “fatpeoplehate.” It was a particularly active community that reveled in finding photos of overweight people looking happy, almost all women, and adding mean captions. Reddit users would then post these images all over the targets’ Facebook pages along with anywhere else on the Internet they could. “What you see on Reddit that is visible is at least 10 times worse behind the scenes,” says Dan McComas, a former Reddit employee. “Imagine two users posting about incest and taking that conversation to their private messages, and that’s where the really terrible things happen. That’s where we saw child porn and abuse and had to do all of our work with law enforcement.”

Jessica Moreno, McComas’ wife, pushed for getting rid of “fatpeoplehate” when she was the company’s head of community. This was not a popular decision with users who really dislike people with a high body mass index. She and her husband had their home address posted online along with suggestions on how to attack them. Eventually they had a police watch on their house. They’ve since moved. Moreno has blurred their house on Google maps and expunged nearly all photos of herself online.

During her time at Reddit, some users who were part of a group that mails secret Santa gifts to one another complained to Moreno that they didn’t want to participate because the person assigned to them made racist or sexist comments on the site. Since these people posted their real names, addresses, ages, jobs and other details for the gifting program, Moreno learned a good deal about them. “The idea of the basement dweller drinking Mountain Dew and eating Doritos isn’t accurate,” she says. “They would be a doctor, a lawyer, an inspirational speaker, a kindergarten teacher. They’d send lovely gifts and be a normal person.” These are real people you might know, Moreno says. There’s no real-life indicator. “It’s more complex than just being good or bad. It’s not all men either; women do take part in it.” The couple quit their jobs and started Imzy, a cruelty-free Reddit. They believe that saving a community is nearly impossible once mores have been established, and that sites like Reddit are permanently lost to the trolls.

When sites are overrun by trolls, they drown out the voices of women, ethnic and religious minorities, gays–anyone who might feel vulnerable. Young people in these groups assume trolling is a normal part of life online and therefore self-censor. An anonymous poll of the writers at TIME found that 80% had avoided discussing a particular topic because they feared the online response. The same percentage consider online harassment a regular part of their jobs. Nearly half the women on staff have considered quitting journalism because of hatred they’ve faced online, although none of the men had. Their comments included “I’ve been raged at with religious slurs, had people track down my parents and call them at home, had my body parts inquired about.” Another wrote, “I’ve had the usual online trolls call me horrible names and say I am biased and stupid and deserve to be raped. I don’t think men realize how normal that is for women on the Internet.”

The alt-right argues that if you can’t handle opprobrium, you should just turn off your computer. But that’s arguing against self-expression, something antithetical to the original values of the Internet. “The question is: How do you stop people from being a–holes not to their face?” says Sam Altman, a venture capitalist who invested early in Reddit and ran the company for eight days in 2014 after one of its many PR crises. “This is exactly what happened when people talked badly about public figures. Now everyone on the Internet is a public figure. The problem is that not everyone can deal with that.” Altman declared on June 15 that he would quit Twitter and his 171,000 followers, saying, “I feel worse after using Twitter … my brain gets polluted here.”

Twitter’s head of trust and safety, Del Harvey, struggles with how to allow criticism but curb abuse. “Categorically to say that all content you don’t like receiving is harassment would be such a broad brush it wouldn’t leave us much content,” she says. Harvey is not her real name, which she gave up long ago when she became a professional troll, posing as underage girls (and occasionally boys) to entrap pedophiles as an administrator for the website Perverted-Justice and later for NBC’s To Catch a Predator. Citing the role of Twitter during the Arab Spring, she says that anonymity has given voice to the oppressed, but that women and minorities are more vulnerable to attacks by the anonymous.

But even those in the alt-right who claim they are “unf-ckwithable” aren’t really. At some point, everyone, no matter how desensitized by their online experience, is liable to get freaked out by a big enough or cruel enough threat. Still, people have vastly different levels of sensitivity. A white male journalist who covers the Middle East might blow off death threats, but a teenage blogger might not be prepared to be told to kill herself because of her “disgusting acne.”

Which are exactly the kinds of messages Em Ford, 27, was receiving en masse last year on her YouTube tutorials on how to cover pimples with makeup. Men claimed to be furious about her physical “trickery,” forcing her to block hundreds of users each week. This year, Ford made a documentary for the BBC called Troll Hunters in which she interviewed online abusers and victims, including a soccer referee who had rape threats posted next to photos of his young daughter on her way home from school. What Ford learned was that the trolls didn’t really hate their victims. “It’s not about the target. If they get blocked, they say, ‘That’s cool,’ and move on to the next person,” she says. Trolls don’t hate people as much as they love the game of hating people.

Troll culture might be affecting the way nontrolls treat one another. A yet-to-be-published study by University of California, Irvine, professor Zeev Kain showed that when people were exposed to reports of good deeds on Facebook, they were 10% more likely to report doing good deeds that day. But the opposite is likely occurring as well. “One can see discourse norms shifting online, and they’re probably linked to behavior norms,” says Susan Benesch, founder of the Dangerous Speech Project and faculty associate at Harvard’s Internet and Society center. “When people think it’s increasingly O.K. to describe a group of people as subhuman or vermin, those same people are likely to think that it’s O.K. to hurt those people.”

As more trolling occurs, many victims are finding laws insufficient and local police untrained. “Where we run into the problem is the social-media platforms are very hesitant to step on someone’s First Amendment rights,” says Mike Bires, a senior police officer in Southern California who co-founded LawEnforcement.social, a tool for cops to fight on-line crime and use social media to work with their communities. “If they feel like someone’s life is in danger, Twitter and Snapchat are very receptive. But when it comes to someone harassing you online, getting the social-media companies to act can be very frustrating.” Until police are fully caught up, he recommends that victims go to the officer who runs the force’s social-media department.

One counter-trolling strategy now being employed on social media is to flood the victims of abuse with kindness. That’s how many Twitter users have tried to blunt racist and body-shaming attacks on U.S. women’s gymnastics star Gabby Douglas and Mexican gymnast Alexa Moreno during the Summer Olympics in Rio. In 2005, after Emily May co-founded Hollaback!, which posts photos of men who harass women on the street in order to shame them (some might call this trolling), she got a torrent of misogynistic messages. “At first, I thought it was funny. We were making enough impact that these losers were spending their time calling us ‘cunts’ and ‘whores’ and ‘carpet munchers,'” she says. “Long-term exposure to it, though, I found myself not being so active on Twitter and being cautious about what I was saying online. It’s still harassment in public space. It’s just the Internet instead of the street.” This summer May created Heartmob, an app to let people report trolling and receive messages of support from others.

Though everyone knows not to feed the trolls, that can be challenging to the type of people used to expressing their opinions. Writer Lindy West has written about her abortion, hatred of rape jokes and her body image–all of which generated a flood of angry messages. When her father Paul died, a troll quickly started a fake Twitter account called PawWestDonezo, (“donezo” is slang for “done”) with a photo of her dad and the bio “embarrassed father of an idiot.” West reacted by writing about it. Then she heard from her troll, who apologized, explaining that he wasn’t happy with his life and was angry at her for being so pleased with hers.

West says that even though she’s been toughened by all the abuse, she is thinking of writing for TV, where she’s more insulated from online feedback. “I feel genuine fear a lot. Someone threw a rock through my car window the other day, and my immediate thought was it’s someone from the Internet,” she says. “Finally we have a platform that’s democratizing and we can make ourselves heard, and then you’re harassed for advocating for yourself, and that shuts you down again.”

I’ve been a columnist long enough that I got calloused to abuse via threats sent over the U.S. mail. I’m a straight white male, so the trolling is pretty tame, my vulnerabilities less obvious. My only repeat troll is Megan Koester, who has been attacking me on Twitter for a little over two years. Mostly, she just tells me how bad my writing is, always calling me “disgraced former journalist Joel Stein.” Last year, while I was at a restaurant opening, she tweeted that she was there too and that she wanted to take “my one-sided feud with him to the next level.” She followed this immediately with a tweet that said, “Meet me outside Clifton’s in 15 minutes. I wanna kick your ass.” Which shook me a tiny bit. A month later, she tweeted that I should meet her outside a supermarket I often go to: “I’m gonna buy some Ahi poke with EBT and then kick your ass.”

I sent a tweet to Koester asking if I could buy her lunch, figuring she’d say no or, far worse, say yes and bring a switchblade or brass knuckles, since I have no knowledge of feuding outside of West Side Story. Her email back agreeing to meet me was warm and funny. Though she also sent me the script of a short movie she had written.

I saw Koester standing outside the restaurant. She was tiny–5 ft. 2 in., with dark hair, wearing black jeans and a Spy magazine T-shirt. She ordered a seitan sandwich, and after I asked the waiter about his life, she looked at me in horror. “Are you a people person?” she asked. As a 32-year-old freelance writer for Vice.com who has never had a full-time job, she lives on a combination of sporadic paychecks and food stamps. My career success seemed, quite correctly, unjust. And I was constantly bragging about it in my column and on Twitter. “You just extruded smarminess that I found off-putting. It’s clear I’m just projecting. The things I hate about you are the things I hate about myself,” she said.

As a feminist stand-up comic with more than 26,000 Twitter followers, Koester has been trolled more than I have. One guy was so furious that she made fun of a 1970s celebrity at an autograph session that he tweeted he was going to rape her and wanted her to die afterward. “So you’d think I’d have some sympathy,” she said about trolling me. “But I never felt bad. I found that column so vile that I thought you didn’t deserve sympathy.”

When I suggested we order wine, she told me she’s a recently recovered alcoholic who was drunk at the restaurant opening when she threatened to beat me up. I asked why she didn’t actually walk up to me that afternoon and, even if she didn’t punch me, at least tell me off. She looked at me like I was an idiot. “Why would I do that?” she said. “The Internet is the realm of the coward. These are people who are all sound and no fury.”

Maybe. But maybe, in the information age, sound is as destructive as fury.

Source: http://time.com/4457110/internet-trolls/
