It’s clear that Americans are more divided than ever. Our self-segregating tendencies have been reinforced by the adoption of Internet technologies and algorithms that personalize our newsfeeds to the point that we walk side-by-side down the same streets in different mental worlds.
Before the web, before the iPhone, Netflix, and Facebook, the physical limits of radio, television, and print technology meant that we had to share. We had to share the airwaves and primetime and the headlines because they were limited resources.
In the pre-Internet world, print was the cheapest communication to scale and thus the most variable. Anyone with a few hundred bucks could print a newsletter, but these self-published efforts were clearly inferior to the major newspapers. You could tell yellow journalism from Pulitzer winners just by the look of the typography and the feel of the paper in your hands. This was true of books and magazines as well. Quality of information was, for the most part, synonymous with quality of production.
To put on a radio or TV show you had to be licensed, and you needed equipment and the technical skills of unionized labor. Broadcast was more resource-intensive than print, and thus more controlled, and thus more trusted. In 1938 The War of the Worlds radio drama fooled otherwise skeptical Americans into believing they were under attack by Martian invaders. The audience was fooled because the show was presented not as a radio play but as a series of news bulletins breaking into otherwise regularly scheduled programming.
The broadcast technologies of the pre-social media world coerced us into consensus. We had to share them because they were mass media, one-to-many communications where the line between audience and broadcaster was clear and seldom crossed.
Then came the public Internet and the World Wide Web of decentralized distribution. Then came supercomputers in our pockets with fully equipped media studios in our hands. Then came user-generated content, blogging, and tweeting, such that there were as many authors as there were audience members. Here the troll was born.
Before the Internet the closest we got to trolling was the prank phone call. I used to get so many prank phone calls as a high schooler in the 1970s that I simply answered the phone with a prank: “FBI HQ, Agent Smith speaking, how may I direct your call?” It makes me crack up to this day!
If you want to blame some modern phenomenon for the results of the 2016 presidential election, and not the people who didn’t vote, or the flawed candidates, or the FBI shenanigans, then blame the trolls. You might think of the typical troll as a pimply-faced kid in his bedroom with the door locked and the window shades taped shut, but those guys are angels compared to the real trolls: the general public. You and me.
Every time you share a link to a news article you didn’t read (which is something like 75% of the time), every time you like a post without critically thinking about it (which is almost always), and every time you rant in anger or in anxiety in your social media of choice, you are the troll.
I can see that a few of my favorite journalists and Facebook friends want to blame our divided culture, the spread of misinformation, and the outcome of the election on Facebook. But that’s like blaming the laws of thermodynamics for a flood or the laws of motion for a car crash. Facebook, and social media in general, was the avenue of communication, not the cause. In technology terms, human society is a network of nodes (people), and Facebook, Google, and Twitter are applications that provide easy distribution of information from node to node. The agents that cause information to flow between the nodes of the social network are human beings, not algorithms.
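To make that analogy concrete, here is a minimal sketch in Python. The names, the friend graph, and the share probability are all made up for illustration; the point is simply that the platform supplies the connections while a story spreads only when a person chooses to pass it along.

```python
import random

# People are nodes; friendships are the edges the platform merely provides.
friends = {
    "alice": ["bob", "carol"],
    "bob": ["alice", "dave"],
    "carol": ["alice", "dave"],
    "dave": ["bob", "carol"],
}

def spread(story, seed, share_probability=0.5):
    """Simulate a story spreading only when a *person* decides to share it."""
    seen = {seed}
    queue = [seed]
    while queue:
        person = queue.pop()
        for friend in friends[person]:
            if friend not in seen and random.random() < share_probability:
                # The platform delivers the story; the human decides to share it.
                seen.add(friend)
                queue.append(friend)
    return seen

print(spread("breaking-news.html", seed="alice"))
```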
It’s hard not to be an inadvertent troll. I don’t have the time to read and research every article that a friend has shared with me. I don’t have the expertise to fact-check and debunk claims outside my own field. Even when I do share an article about a topic I deeply understand, it’s usually to get a second opinion.
From a tech perspective, there are a few things Facebook, Google, and Twitter can do to keep us from trolling each other. Actually, Google is already doing most of these things with its PageRank algorithm and quality scores for search results. Google even hires human beings to test and verify the quality of its search results. Thus, it’s really hard for us to troll each other with phony web pages claiming to be about cats when dogs are the topic. Kudos to Google!
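For the curious, here is a toy version of the published PageRank idea, not Google’s actual production system; the example domains are invented. A page that nothing reputable links to earns almost no authority, no matter what it claims to be about.

```python
# Toy PageRank: pages linked to by well-linked pages score higher,
# so an orphan phony page earns only the small "teleport" share of rank.
links = {
    "dogs.example": ["vets.example", "kennel.example"],
    "vets.example": ["dogs.example"],
    "kennel.example": ["dogs.example", "vets.example"],
    "phony-cats.example": ["dogs.example"],  # links out, but nothing links to it
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(f"{page:20s} {score:.3f}")  # phony-cats.example lands at the bottom
```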
The following advice is for Facebook and Twitter, from an admiring fan…
First, hire human editors. You’re a private company, not a public utility. You can’t be neutral, you are not neutral, so stop pretending to be neutral. I don’t care which side you pick, just pick a side, hire some college-educated, highly opinionated journalists, and edit our news feeds.
Second, give us a “dislike” button and, along with it, “true” and “false” buttons. “Like” and “retweet” are not the only legitimate responses human beings have to news. I like the angry face and the wow face, but those reactions are feelings and thus difficult to interpret clearly in argumentation and discourse. Dislike, true, and false would create strong signals that could help drive me and my friends to true consensus through real conversations.
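As a rough sketch of what explicit “true” and “false” buttons could add beyond a like count, imagine the votes rolled up into a simple credibility score per story. The vote counts and the threshold below are assumptions for illustration, not an actual Facebook or Twitter feature.

```python
def credibility_signal(true_votes, false_votes, min_votes=25):
    """Return a score in [0, 1], or None until enough people have weighed in."""
    total = true_votes + false_votes
    if total < min_votes:
        return None  # not enough signal yet; don't surface a verdict
    return true_votes / total

score = credibility_signal(true_votes=180, false_votes=820)
print(f"credibility: {score:.2f}")  # 0.18 -> readers largely dispute this story
```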
Third, give us a mix of news that you predict we would like and not like. Give us both sides or all sides. And use forensic algorithms to weed out obvious trash like fake news sites, hate groups with nice names, and teenagers pretending to be celebrities.
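And here is a rough sketch of what that third suggestion might look like, assuming invented field names, a hypothetical trash flag from the forensic checks, and an arbitrary 50/50 split between agreeable and challenging stories.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Story:
    url: str
    predicted_like: float  # the platform's guess that this reader will "like" it
    flagged_trash: bool    # caught by forensic checks (fake domain, bot source)

def build_feed(stories, size=6):
    # Weed out the obvious trash first, then mix both sides of the prediction.
    candidates = [s for s in stories if not s.flagged_trash]
    agreeable = sorted(candidates, key=lambda s: -s.predicted_like)
    challenging = sorted(candidates, key=lambda s: s.predicted_like)
    half = size // 2
    feed = agreeable[:half]
    feed += [s for s in challenging[:half] if s not in feed]
    return feed
```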
A/B test these three ideas, and better ones, and see what happens. My bet is social media will be a healthier place, but a smaller place, with less traffic driven by the need to abuse each other.
We’ll still try to troll the hell out of each other, but it will be more time-consuming. Trolling is part of human nature, and so is being lazy. So just make it a little harder to troll.
Before social media our personal trolling was limited to the dinner table or the locker room. Now our trolling knows no bounds because physical limits don’t apply on the Internet. We need limits, like spending limits on credit cards, before we troll ourselves to death.
Comments
4 responses to “Trolls Are USA”
John, this is such an important and relevant perspective. I’m sure I’m guilty of “trolling” to some degree, but I’ll also admit to at least “skimming” the majority of the articles I like or add to my feed. That said, the “alt-media” (I won’t just blame the alt-right here…) does such a disservice to us all by creating completely false and misleading “headlines” that are not even slightly supported by the “reporting” in the attached article. And even when “fact checking”… the “fact-checkers” are now under attack as being biased and misleading. I’m afraid that the genie is out of the bottle… and “trolling” is here to stay. It amazes me what people will say if it’s “just on the internet” that they would NEVER consider saying face to face or in mixed company. The boundary between the “dinner table” and the “locker room” has been overrun, and I’m afraid it’s never going to be rolled back. Great post. Peace
Yes, I agree trolling is here forever. But we still need to take the same responsibility online for our actions that we do offline. In fact, your online actions are shared with potentially millions, while your offline actions are constrained to within a few dozen feet of your location. If you’re going to be an inadvertent troll, do it in real life, not on the Internet.
I think that your intuition that most of this is unintentional trolling is correct. I don’t think that uninformed people pass on bogus information maliciously. But malicious or not, the effect is damaging.
I wonder, though, about the backlash against the kind of editorial function you propose. Pseudo-arguments along the lines of “thought police” and “1st Amendment rights”. So then, in response, Social Network Inc. decides that it needs to create a multi-partisan editorial board to oversee this function to ensure fairness, balance (ha), and eliminate political bias. This naturally becomes contested territory and political structures crystallize around it… but within the context of a private company. Then, even further, consider the implications of the international nature of services like Facebook for this politicized process.
So, to my mind, Benjamin Bratton’s idea that global-scale computation has profound implications for the future of the idea of national sovereignty makes a lot of sense.
So when you say “just” make it a bit harder to troll… I suspect that the task is actually much larger and more complicated than it seems…
…but I certainly support the effort.
Great article! Thanks!
I like your ideas. But your hate group may be my golden light.
I tend to call out posts on my FB feed. E.g., someone posted a scare article about TB among Syrians in Minnesota.
I went, read the article, and posted back to her that:
A: There were some 500 cases among several million Minnesotans.
B: It’s not surprising that 80% of them were among a group of former refugees.
C: TB is not very contagious.
D: TB is not expensive to treat.
Similarly, one way I now evaluate any article is to look at the domain’s home page. If that page is full of crackpot headlines, I will point out some of the other titles to the author of the post.
In some cases I know the site, e.g. the Washington Times or the New York Post. Both are trash sites with about as much credibility as the National Enquirer and Midnight Star.
It’s an uphill battle. But if more people point out the fallacies, we collectively make the signal-to-noise ratio better.