Aug 18, 2020

You Thought The Tea Party Was Bad?


Remember hearing the "theory" that the Romans crashed and burned because they all got brain damage from the lead that leached out of their plumbing into their drinking water?

Well, it turns out that didn't happen, but...wow, man - maybe it really did happen that way and somebody needs you to believe otherwise because yada yada yada.

So, what if civilizations naturally arrive at a point where the weirdness of their everyday politics gets so out of whack that people just start making shit up, and over time it spins so terribly out of control that society takes on the kind of mania that inevitably leads to self-destruction on a massive scale?

Guess what.

NYT:

QAnon Was a Theory on a Message Board. Now It’s Headed to Congress.

A supporter of the dangerous conspiracy theory won a primary runoff on Tuesday. The social media platforms have some soul-searching to do.

For almost three years, I’ve wondered when the QAnon tipping point would arrive — the time when a critical mass of Americans would come to regard the sprawling pro-Trump conspiracy theory not merely as a sideshow, but as a legitimate threat to safety and even democracy.

There have been plenty of potential wake-up calls. Among them: a 2018 standoff at the Hoover Dam with a QAnon believer, the 2019 murder of a Gambino crime family boss by a QAnon supporter who believed the boss was part of a deep-state cabal, an August 2019 F.B.I. report that warned that QAnon could spur domestic terrorism, a West Point report calling the movement “a security threat in the making,” and the April arrest of a QAnon follower who was found with a dozen knives while driving to “take out” Joe Biden, the former vice president and presumptive Democratic presidential nominee.

But it seems the true tipping point came this week. First was the report from Ari Sen and Brandy Zadrozny at NBC News about an internal Facebook investigation that gives the first real glimpse into the size of QAnon’s online footprint. The investigation found millions of members across thousands of QAnon groups and pages.

This was followed by a Guardian investigation that found “more than 170 QAnon groups, pages and accounts across Facebook and Instagram with more than 4.5 million aggregate followers.”


Then, on Tuesday, Marjorie Taylor Greene, a Republican from Georgia who has been vocal in her support of QAnon, won a primary runoff. (In recently uncovered blog posts, Ms. Greene said that Hillary Clinton had a “kill list” of political enemies and questioned whether the 2017 Las Vegas mass shooting was orchestrated in a bid to overturn the Second Amendment.) Given the deeply Republican makeup of Ms. Greene’s district, she is widely expected to be elected to Congress in November.

This week’s news is a sign of QAnon’s increasing influence in American cultural and political life. What started as a niche web of disproved predictions by an anonymous individual has metastasized into a movement that is now too big to be ignored.

The recent news also feels like a clear example of the real-world consequences of our broken information ecosystem. QAnon’s rise is the direct result of a world in which media and politics are distorted by the dizzying scale of social networks, by their lack of adequate content moderation, and by the gaming of algorithms and hashtags. While the social media platforms didn’t create QAnon, they created the conditions for it to thrive. One can draw a straight line from these companies’ decisions — or, more accurately, their inaction — to where we are today.

Take the sheer size and scope of Facebook’s QAnon networks as revealed by NBC and The Guardian. Those reports note that Facebook groups and pages — many of which are private and therefore harder to moderate — drive QAnon’s growth on the platform by providing community to the movement’s followers. That growth is a feature, not a bug — one that Facebook began to prioritize in the summer of 2017, just four months before the first post from “Q” appeared on the message board 4chan.

That summer, onstage at a Facebook summit, the company’s C.E.O., Mark Zuckerberg, announced that Facebook would shift its focus toward building “meaningful communities” to “bring the world closer together.” Mr. Zuckerberg also announced that the company would use an artificial intelligence system to recommend such communities to users.

Facebook’s recommendations systems, designed to prioritize the growth of groups, most likely supercharged the QAnon community — exposing scores of people to the conspiracy theory and then forging bonds among like-minded believers who could communicate, organize and spread their message further. As NBC News’s Ben Collins notes, this spread has intensified during the coronavirus pandemic as QAnon has become a hub for public health misinformation on Facebook. According to The Wall Street Journal, “the average membership in 10 large public QAnon Facebook groups swelled by nearly 600 percent from March through July, to about 40,000 from about 6,000.”

QAnon followed a similar growth strategy on platforms like YouTube, building channels around influencers savvy enough to game the platform’s recommendation algorithms. On Twitter, the communities formed around the successful manipulation of hashtags, efforts amplified by the Trump campaign and the president’s Twitter feed. (On Friday Mr. Trump refused to answer whether he supported QAnon.)

This online ecosystem has been attractive to some political candidates. “Politicians see the infrastructure QAnon has built on these platforms. They recognize it as increasing in power and see it as having a political benefit,” said Alex Kaplan, a researcher for the media watchdog group Media Matters for America who has been tracking the increase in QAnon supporters running for Congress. “There are true believers, yes, but many also see pandering to QAnon as a way to cultivate political support. They say, ‘why not use this infrastructure to get some benefit?’ — be it followers or money or votes.” Mr. Kaplan has reported that there are at least 20 candidates on the ballot in November who support or have spoken favorably of QAnon.

The overtures of campaigns like Ms. Greene’s — and President Trump’s — are only likely to become more overt as QAnon moves further into the mainstream. Journalists like Mr. Kaplan are concerned that more media coverage will lead to the conspiracy theory being normalized. “People should be worried. They should not get used to this,” he told me. “It’s crucial to remember this all started as a theory on a message board linked to white nationalists and trolls that President Trump was involved in a secret plot to take down the deep state and pedophiles. That’s what all of this is.”

For those who’ve been following and reporting on QAnon since its earliest days, this week has been disorienting and disheartening. “It’s a horrifying, humbling and depressing feeling to have seen something like this back when it was just a few forum posts, warn of its potential to infect the nation and end up right,” Paris Martineau, a technology reporter who wrote the first explainer on QAnon for a national news outlet in 2017, told me. “I feel like, over the past three years, there have been so many moments where I thought it had reached its zenith, but it was really only just getting started.”

It’s no coincidence that a technology reporter was one of the first to identify this phenomenon — indeed, much of the best coverage of the movement has come from those steeped in the workings of social networks. QAnon is a product of the modern algorithmically powered internet (a fact that reporters flocking to cover the movement need to be mindful of).

The stewards of the Big Tech platforms surely didn’t envision QAnon when they sought to connect the world. But they should have. Because the movement leveraged all the platforms’ tools exactly as they were intended to be used.

“When you bring people together, you never know where it will lead,” Mr. Zuckerberg said at the end of his speech in 2017.

He was right.

