Slouching Towards Oblivion

Showing posts with label free speech. Show all posts

Sunday, November 19, 2023

Twixter



Antisemitism was rising online. Then Elon Musk’s X supercharged it.

After neo-Nazi protests in Charlottesville, white supremacists were confined mostly to fringe websites. Musk’s purchase of Twitter changed that.


In the weeks following the Oct. 7 Hamas attack on Israel, Twitter user @breakingbaht criticized leftists, academics and “minorities” for defending the militant group. But it wasn’t until the user spoke up on behalf of antisemites that he struck a viral chord with X owner Elon Musk.

The user blamed Jewish communities for bringing antisemitism upon themselves by supporting immigration to the United States, welcoming “hordes of minorities” who don’t like Jews and promoting “hatred against whites.”

“You have said the actual truth,” Musk responded. Soon, @breakingbaht had gained several thousand new followers — and the antisemitic conspiracy theory that Jews are causing the replacement of White people was ricocheting across the internet once again.

Antisemitism has long festered online, but the Israel-Gaza war and the loosening of content moderation on X have propelled it to unprecedented levels, coinciding with a dramatic rise in real-world attacks on Jews, according to several monitoring organizations.

Since Oct. 7, antisemitic content has surged more than 900 percent on X and there have been more than 1,000 incidents of real-world antisemitic attacks, vandalism and harassment in America, according to the Anti-Defamation League — the highest number since the human rights group started counting. (That includes about 200 rallies the group deemed to be at least implicitly supporting Hamas.)

Factors that predate the Gaza war laid the groundwork for the heightened antisemitic atmosphere, say experts and advocates: the feeling of empowerment some neo-Nazis felt during the Trump presidency, the decline of enforcement on tech platforms in the face of layoffs and Republican criticism, even the 11-day war between Israel and Hamas in 2021, which gave rise to harsh criticism of Israel’s actions and sustained antisemitism online.

But Musk plays a uniquely potent role in the drama, disinformation specialists say. His comments amplifying antisemitic tropes to his 163.5 million followers, his dramatic loosening of standards for what can be posted, and his boosting of voices that previously had been banned from the platform formerly known as Twitter all have made antisemitism more acceptable on what is still one of the world’s most influential social media platforms.

Musk’s endorsement of comments alluding to the great replacement theory — a conspiracy theory espoused by neo-Nazi demonstrators in Charlottesville in 2017 and the gunmen who killed people inside synagogues in Pittsburgh in 2018 and Poway, Calif., in 2019 — brought condemnation from the White House and advertising cancellations from IBM, Apple, Comcast, and Disney, among others.

Late Friday, Musk was unrepentant: “Many of the largest advertisers are the greatest oppressors of your right to free speech,” he tweeted after word of the cancellations spread. He did not respond to an emailed request for comment.

Joan Donovan, a former research director at Harvard University’s Shorenstein Center who now teaches at Boston University, included Musk in what she described as “a strata of influencers … who feel very comfortable condemning Jewish people as a political critique.”

“In moments where there is a lot of concern, these right-wing influencers do go mask-off and say what they really feel,” she said.

The Israel-Gaza war also has given new life to prominent Holocaust deniers who have proclaimed on X, Telegram and other platforms that the Hamas attacks that left hundreds of Israelis dead were “false flags.” The #Hitlerwasright hashtag, which surged during the 2021 war, has returned, with Memetica, a digital investigations firm, tallying 46,000 uses of the phrase on X since Oct. 7. Previously, the hashtag appeared fewer than 5,000 times per month.

The Center for Countering Digital Hate, a nonprofit focused on online extremism and disinformation, identified 200 posts that promoted antisemitism and other forms of hate speech amid the conflict. X allowed 196 of them to remain on the platform, the group said in a report.

Seventy-six of those posts amassed a collective 141 million views in 24 hours after an explosion at the al-Ahli hospital in Gaza City on Oct. 17. The majority of the posts appeared on X Premium accounts, a subscription service that grants a blue “verified” check mark to anyone willing to pay a monthly fee. Previously, such status was available only to public figures, journalists and elected officials.

“Elon Musk has shaped X into a social media universe that revolves around his beliefs and whims while still shaping politics and culture around the world. And he’s using it to spread the most disgusting lies that humans ever invented,” said Emerson Brooking, resident fellow at the Digital Forensic Research Lab of the Atlantic Council think tank and co-author of the 2018 book “LikeWar: The Weaponization of Social Media.”

Antisemitism goes mainstream

Hatred against Jews has long been a feature of the internet. Extremists were early adopters of social media platforms, using them to find like-minded people to share views that would be distasteful in other settings, Brooking said.

In the mid-2000s, lies spread by anonymous users on platforms such as 4chan and Usenet blamed Jews for the Sept. 11, 2001, attacks and for the 2008 financial crisis. But the most extreme antisemitism, such as Holocaust denial, remained largely confined to the fringe, said Oren Segal, vice president of the Center on Extremism at the ADL. Well-known Holocaust deniers had little access to mainstream news media.

By the 2010s, however, an internet subculture that repackaged antisemitism into something seemingly more palatable started to take shape — often on newer and less moderated platforms like Discord, 8chan, and Telegram, and also on mainstream services like Facebook and YouTube. Instead of swastikas, the currency became jokes, memes like Pepe the Frog, and terms for white supremacy like “alt-right.” The election of former president Donald Trump galvanized this group; Richard B. Spencer, then president of the white-supremacist National Policy Institute, made headlines by telling a meeting of supporters after Trump’s election victory, “Hail Trump! Hail our people! Hail victory!”

“Suddenly, racists and antisemites who had lived at the margins of society found that they had new legitimacy. And a rising generation of far-right Americans saw that it was okay to say and do hateful things, because the president was doing them already,” Brooking said.

The 2017 Unite the Right rally in Charlottesville, organized on Facebook and the gaming platform Discord, became the first time a broad group of Americans, watching on television and online, heard the slogan “Jews will not replace us,” chanted by a torch-carrying crowd seeking to prevent the removal of a statue of Confederate Gen. Robert E. Lee.

“We saw an inflection point where online expression had turned into bigger real-world organizing,” the ADL’s Segal said of the demonstration.

Trump did little to tamp down these ideas and often amplified them, occasionally retweeting antisemitic memes and famously saying “there were very fine people on both sides” of the Charlottesville rally, at which a neo-Nazi sympathizer drove his car into counterprotesters, killing a woman.

In an emailed statement, the Trump campaign denounced any effort to link the former president to antisemitism. “The real racists and antisemites are deranged Democrats and liberals who are marching in support of terrorist groups like Hamas and calling for the death of Israel,” the statement said. “There has been no bigger champion for Israel than President Trump, as evidenced by moving the U.S. Embassy to Jerusalem, signing laws that curb anti-Semitism, and much more.”

The statement added, “For a media organization like The Washington Post to make such a ridiculous charge proves it has its own racism and anti-Semitism issues they must address before casting stones.”

The Trump years also saw the rise of mass shooters steeped in antisemitic fabrications. In New Zealand, El Paso, Buffalo, and at the Tree of Life synagogue in Pittsburgh, shooters cited the great replacement theory as their inspiration, and in some cases posted manifestos about it.

Amid the growing violence, tech platforms that had taken a tolerant approach to antisemitic posts cracked down. YouTube banned Holocaust denial in 2019 and Meta did so in 2020, after CEO Mark Zuckerberg had defended not prohibiting such content just two years earlier. Both companies expanded their hate speech policies to include white-supremacist content in 2019.

Those actions sent antisemitism back to the fringes, and to newer services, such as Gab, that specifically catered to right-wing audiences. “What I can tell you is major accounts that were spreading antisemitism … were falling like dominoes,” said Heidi Beirich, co-founder of the Global Project Against Hate and Extremism. “They were quickly re-platforming themselves in places like Gab. But there they were more preaching to the choir as opposed to being able to radicalize random people.”

Then in 2022, Musk’s $44 billion purchase of Twitter closed.

The ripple effect

Musk had been saying for months that one of the reasons he wanted to buy Twitter was to embrace “free speech” and relax the platform’s content moderation practices. Hours after he took over, anonymous trolls flooded the site with racist slurs.

The rise in bigotry on the platform prompted civil rights groups to pressure advertisers — sometimes successfully — to pause spending on Twitter. Last November, Musk extended an olive branch to those activists, pledging in a private meeting not to reinstate banned accounts until there was a process to do that. That concession angered far-right influencers on the site, who accused him of being a traitor to their cause.

Later that month, Musk reinstated thousands of accounts — including Trump’s — that had been banned for threats, harassment and misinformation. Since then, hateful rhetoric on the platform has increased, researchers said.

Musk invited back banned Hitler apologists, sent out his own antisemitic tweets to his followers, and promoted the work of Great Replacement backers including former Fox News host Tucker Carlson. Those actions demolished the previous bounds of acceptable speech, inviting more people to weigh in with wild theories and emotions about religious and ethnic minorities.

On Wednesday, Gab’s official X account shared a meme celebrating that Musk had affirmed “Jews are the ones pushing anti-White hatred” along with the caption, “We are so back.” (The X post, which has since been deleted, was liked 19,000 times and viewed 720,000 times.)

On Friday, several major companies announced that they were pulling advertising from X, including Apple, Lionsgate Entertainment and Comcast, parent of NBCUniversal. In the first quarter of 2022, Apple was Twitter’s top advertiser, accounting for nearly $50 million in revenue. Media Matters, a nonprofit media watchdog, published a report showing that X has been placing ads for Apple, Bravo, IBM, Oracle, Amazon and more next to pro-Nazi content. On Saturday, Musk threatened to sue Media Matters, accusing it of misrepresenting “the real experience on X.”

Some news publishers also have pulled out of the platform. NPR shut down its X account in April after Musk falsely labeled the nonprofit broadcaster “state controlled media.” On Thursday, the journalist Casey Newton announced that he would be pulling Platformer, the independent tech news outlet he founded, from X and would no longer include posts on X in the Platformer newsletter.

“It’s the only way I know how to send the message that no one should be there, that this is not a place where you should be going to get news or to discuss news or to have a good time,” he told The Post. “It is just over. If you wouldn’t join Gab, or Parler, or Truth Social, there’s no reason you should be on X. I think it’s time for journalists and publishers, in particular, to acknowledge the new reality and to get the heck off that website.”

Newton said that media companies, including The Post, that continue to pay to advertise on the site are funding Musk’s hate campaigns. “Publishers have to look themselves in the mirror and ask, why did they get into this business in the first place?” he said. “Didn’t it have something to do with speaking out against oppression and bigotry and standing up in the face of oppression?”

A Post spokesperson declined to comment.

Hateful rhetoric that appears on X ripples out to the whole internet, normalizing an unprecedented level of antisemitic hate, experts said. “Twitter is the most influential platform in shifting sentiments,” said Imran Ahmed, CEO of the Center for Countering Digital Hate. “[It] has always had an outsize influence in determining what takes start to be perceived as the vox populi.” Musk has sued the CCDH for defamation over its reports on X.

The international reach of big social platforms such as Instagram and TikTok also has served to highlight tensions. TikTok has come under fire for videos critical of Israel or supportive of Palestinians that carry the #freepalestine hashtag; TikTok data show that many of those arise from predominantly Muslim countries, such as Malaysia and Lebanon, where support for Palestinians has long been high.

Dozens of high-profile Jewish content creators issued an open letter to TikTok earlier this month, saying that the platform hadn’t done enough to counter hatred and abuse toward the Jewish community on the app. On Wednesday, many of those creators, along with prominent celebrities including Amy Schumer, Debra Messing and Sacha Baron Cohen, met with representatives from the company to voice their concerns. The conversation was heated and intense, according to creators who attended.

“We recognize this is an incredibly difficult and fearful time for millions of people around the world and in our TikTok community,” TikTok said in a statement. “Our leadership has been meeting with creators, civil society, human rights experts and stakeholders to listen to their experiences and feedback on how TikTok can remain a place for community, discovery, and sharing authentically.” Since Oct. 7, TikTok has removed more than 730,000 videos for hate speech, including content promoting antisemitism, the company said.

Content creator Montana Tucker, the granddaughter of Holocaust survivors who has more than 9 million followers on TikTok and 3 million on Instagram, attended the meeting with TikTok. She said she’s noticed a sharp uptick in antisemitism across all platforms, and plans to stay on X for now.

“It’s happening on every single app, unfortunately,” she said. “All of these people, I’m sure they would love for us to hide and to not post and to not share … but we need to be more vocal. We need to be on these apps and we need to continue to share. I think it’s more of a reason I need to start posting more on [X].”

Outside of social media, white supremacists and neo-Nazis have continued to use lightly moderated messaging platforms such as Telegram and group-run websites to distribute hate messages and propaganda since the Israel-Gaza war began, according to the Counter Extremism Project, a nonprofit that tracks the groups. The Global Project Against Hate and Extremism found that antisemitic and anti-Muslim posts on 4chan, Gab, Odysee, and Bitchute increased 461 percent from 618 to 3,466 from Oct. 6 to Oct. 8.

A researcher at the Institute for Strategic Dialogue, a London think tank that tracks hate and disinformation, said online extremists were having a “field day,” with far-right groups using Hamas propaganda to bolster antisemitic messages.

Russia’s sophisticated disinformation apparatus also has seized on the conflict. One of Russia’s widest ongoing campaigns, known as Doppelgänger, promotes fake articles on clones of major media websites. Links to the pages are sent out rapidly by large networks of automated accounts on X and Facebook.

For the past year, most of these articles have been aimed at undermining Western support for Ukraine, Russia’s top priority. But not long after Oct. 7, some Doppelgänger assets started promoting the idea that the United States cared far more about Israel and would stop sending Ukraine as much aid, according to Antibot4Navalny, a group of volunteers who track Russian disinformation on the internet.

More recently, the social media accounts amplified pictures of the Jewish Star of David spray-painted on buildings in Paris, according to the nonprofit E.U. DisinfoLab. That advanced multiple objectives, the organization said: It generated additional concern about possible increases in antisemitism in France. It likely encouraged antisemites to think they are greater in number. And above all, it focused attention on Israel, rather than Ukraine and Russia.

Benjamin Decker, founder of Memetica, said that a major portion of 4chan links to outside coverage of Israel and Hamas go to articles from media sources in Iran, China or Russia. “You can’t attribute it to these actors yet, but from the beginning there have been cross-platform communities with a vested interest in stoking hate,” he said. “There is a highly digital far-right community who loves celebrating the deaths of Jews, and that dovetails with Hamas.”

“We’re in a really dangerous place,” the CCDH’s Ahmed said. “There’s no clearer link between disinformation, conspiracy theories, and real world hate than there is with antisemitism.”

Monday, November 07, 2022

Today In ElmoLand

Elon Musk loves to spout off about "unfettered free speech" and "the first amendment means the right to express your views is absolute".



Tuesday, November 01, 2022

It's Elon's Twitter Now

There's a sewer-y part of practically everything. That doesn't mean you should seek it out, and live in it. Although, on the intertoobz at least, that's exactly what some folks are wont to do.

(pay wall)

A BASELESS CONSPIRACY theory about the assault of House Speaker Nancy Pelosi’s husband Paul Pelosi trended on Twitter Monday morning after being boosted all weekend by prominent conservatives and even new Twitter owner Elon Musk.

Hashtags including “PelosiGayLover,” “PelosiSmollett,” “PelosiGate,” and “Listen to the 911” appeared in the trending bar amid the proliferation of false claims about the attack and mockery of Pelosi. Several prominent right-wing figures pushed the false ideas that both the attacker and Pelosi were in their underwear at the time of the assault, that Pelosi knew his attacker, and that the two were actually lovers because Pelosi had referred to him as a “friend” while attempting to tip off 911 dispatchers to his situation.


San Francisco police have debunked claims that both men were in their underwear and that Pelosi knew the attacker. The attacker, David DePape, 42, broke into the Pelosi home early Friday morning, allegedly shouting “Where’s Nancy?” before ultimately attacking Paul Pelosi with a hammer. Pelosi underwent “successful surgery to repair a skull fracture and serious injuries to his right arm and hands,” and according to a statement from the speaker’s office, “doctors expect a full recovery.”

Musk on Sunday tweeted (and later deleted) a story from right-wing rag The Santa Monica Observer claiming Paul Pelosi was not the victim of a break-in, but that the attack was part of a domestic dispute with a male prostitute. It has been widely noted that the Observer has a history of publishing false claims, including that Hillary Clinton had died and been replaced with a body double. Musk later made fun of The New York Times for reporting that the tweet was based on a claim from a regular source of misinformation.

Former first son Donald Trump Jr. mocked the attack on Instagram and Twitter later on Sunday, posting a meme depicting a pair of underwear and a hammer with the text, “Got my Paul Pelosi Halloween costume ready.” Jr. captioned the post “OMG. The internet remains undefeated.”


Trump Jr. on Monday morning tweeted out a photo of a hammer in a holster, captioned “open carry in San Francisco.”


Sitting politicians have also been mocking the attack. Rep. Clay Higgins (R-La.) tweeted, then deleted, an image mocking Speaker Pelosi, captioning the tweet, “That moment you realize the nudist hippie male prostitute LSD guy was the reason your husband didn’t make it to your fundraiser.”

Researchers have found that while DePape did hold anti-establishment ideologies, his online activity indicated a longstanding pattern of extremist beliefs, including QAnon conspiracies, Holocaust denial, false voter fraud claims, and screeds against trans people and “groomers.”

On Fox News, the hosts of Fox & Friends alluded to the conspiracy on air. “Something, something doesn’t make sense,” said host Pete Hegseth, adding that it “doesn’t add up.” Larry Elder, the former California gubernatorial candidate and frequent Fox News guest, mocked the attack at an event Sunday night, saying that between the DUI conviction and the assault, Pelosi “was hammered twice in six months.”
 

Professional conspiracy theorists beat the disinformation drum on Twitter all weekend. Dinesh D’Souza began publishing an alternate version of events on social media almost immediately after the attack. Former Trump administration hand Sebastian Gorka published what is allegedly a partial clip of 911 dispatchers sending someone to Pelosi’s home, captioning the video “The Paul Pelosi 911 Lie…” (The clip does not feature Paul Pelosi himself speaking to officers.) Pizzagate conspiracy theorist Mike Cernovich claimed news outlets are “hiding facts from the public,” suggesting that a break-in at the Pelosi home was implausible and baselessly alluding to a connection between the attack and Paul Pelosi’s recent DUI case.

Musk’s acquisition of Twitter has thrown the future of enforcement of anti-disinformation policies into chaos. Massive layoffs are expected at the social media company, and over the weekend Musk tweeted that he had not yet made any changes to Twitter’s content moderation policies, but reports indicated that instances of racist and hateful abuse on the platform skyrocketed in the immediate aftermath of Musk’s takeover.

On Monday morning, Fox News host Ainsley Earhardt defended Musk tweeting a conspiracy theory about Pelosi’s attack, labeling it a “free speech” issue.

I haven't decided yet whether to stay with Twitter and lurk, so I can keep an eye on it, or bail and try to find something else. 

So far, I've seen a lot more weird shit, but that could be me engaging with the crazies more than I have done (mostly Blue Check Crazies) so maybe the algorithm is slanted in their direction.

I have noticed though that I've lost a couple of hundred followers, which may be the beginnings of "Twexit".

I don't know. Interesting times.

Saturday, July 23, 2022

It Gets Worse


Please take all of this, strap it to a cinder block, and shove the whole thing up your ass.

Your pal,

Mike 

PS)

WaPo: (pay wall)

South Carolina bill outlaws websites that tell how to get an abortion

More states could follow, setting up a battle over the future of online speech across the country.


Shortly after the Supreme Court ruling that overturned the right to abortion in June, South Carolina state senators introduced legislation that would make it illegal to “aid, abet or conspire with someone” to obtain an abortion.

The bill aims to block more than abortion: Provisions would outlaw providing information over the internet or phone about how to obtain an abortion. It would also make it illegal to host a website or “[provide] an internet service” with information that is “reasonably likely to be used for an abortion” and directed at pregnant people in the state.

Legal scholars say the proposal is likely a harbinger of other state measures, which may restrict communication and speech as they seek to curtail abortion. The June proposal, S. 1373, is modeled off a blueprint created by the National Right to Life Committee (NRLC), an antiabortion group, and designed to be replicated by lawmakers across the country.


As the fall of Roe v. Wade triggers a flood of new legislation, an adjacent battleground is emerging over the future of internet freedoms and privacy in states across the country — one, experts say, that could have a chilling impact on First Amendment-protected speech.

“These are not going to be one-offs,” said Michele Goodwin, the director of the Center for Biotechnology and Global Health Policy at the University of California at Irvine Law School. “These are going to be laws that spread like wildfire through states that have shown hostility to abortion.”

Goodwin called the South Carolina bill “unconstitutional.” But she warned it’s unclear how courts might respond after “turning a blind eye” to antiabortion laws even before the Supreme Court overturned Roe.

Many conservative states’ legislative sessions ended before the Supreme Court’s decision, and won’t resume until next year, making South Carolina’s bill an anomaly. But some tech lobbyists say the industry needs to be proactive and prepared to fight bills with communications restrictions that may have complicated ramifications for companies.

“If tech sits out this debate, services are going to be held liable for providing basic reproductive health care for women,” said Adam Kovacevich, the founder and CEO of Chamber of Progress, which receives funding from companies including Google and Facebook.


Tech companies could soon be navigating a disparate patchwork of state laws, caught in the middle of a political tug of war between red states and blue states. Democrats are already considering new data privacy proposals to protect reproductive health data and other digital trails that could be used to prosecute people seeking abortion. Meanwhile, Republican states could attempt to preserve and collect that same data, which has been used as key evidence in cases against pregnant women.

Eric Goldman, a professor at Santa Clara University School of Law, said the First Amendment and Section 230, the law that shields internet providers and tech companies from liability for the posts, photos and videos people share on their sites, provide a strong defense in many instances for websites and providers facing lawsuits over hosting information about abortion access.


But individuals could face liability for aiding and abetting people in accessing a criminalized procedure if they send messages about how to obtain an abortion or otherwise break the law.

For the NRLC, which wrote the model legislation, limiting communication is a key part of the strategy to aggressively enforce laws restricting abortion. “The whole criminal enterprise needs to be dealt with to effectively prevent criminal activity,” Jim Bopp, the group’s general counsel, wrote in a July 4 memo, comparing the group’s efforts to fighting organized crime.

In an interview with The Washington Post, Bopp said that the group has refined its blueprint for states since the South Carolina bill was introduced last month. The restrictions on websites and internet hosts in the July model bill language would only apply when the information is likely to be used “for an unlawful abortion in this state,” he said, not abortions generally, as the South Carolina bill says.

The group “tried to be very careful in vetting this so it doesn’t impinge on First Amendment rights,” he added. He said the provision was intended to limit the trafficking of abortion-inducing drugs, which throughout the interview he compared to the trafficking of fentanyl.


Yet there’s broad uncertainty about how courts would interpret such bills, which might lead to companies and websites taking down information about abortions for fear of lawsuits.

“The legal ambiguity works in favor of regulators,” Goldman said. “They can suppress a lot of constitutionally protected speech just because of fear of liability.”

Democrats are expected to respond to conservative states’ restrictions with their own regulatory efforts, largely focused on protecting sensitive data. California State Assembly member Mia Bonta introduced legislation earlier this year that would protect people from law enforcement requests from other states to turn over information that would identify people seeking an abortion.

A staffer in Bonta’s office said she introduced the legislation amid concerns that the Supreme Court would overturn Roe. Planned Parenthood Affiliates of California approached her with the concept of the legislation. The bill will have a hearing in August, and Bonta’s staff is working on amendments to strengthen the legislation in the wake of the Dobbs v. Jackson Women’s Health Organization decision.

“Just because the Supreme Court has decided to strip us of the fundamental right to choose what [to do] with our bodies, doesn’t mean California will stand back and allow others to use our systems to obtain information to hurt people who are exercising a fundamental right here in California,” Bonta said.


Democrats in Congress have also introduced the “My Body, My Data Act,” which would create new privacy protections for reproductive health data. The bill has little chance of becoming law in a narrowly divided Congress, but Rep. Sara Jacobs (D-Calif.), the legislation’s architect, previously told The Post that she wants states to replicate the bill.

Privacy and tech advocacy groups are trying to gear up for the post-Dobbs battles. The Center for Democracy and Technology on Tuesday announced a new task force focused on protecting reproductive health information, which convened academics, civil rights groups and privacy organizations.

The Electronic Frontier Foundation, a privacy advocacy group, expressed support for the California privacy bill and is reviewing the South Carolina legislation. Hayley Tsukayama, a senior legislative activist at EFF and a former Post reporter, said the South Carolina bill has “serious problems.”

She’s anticipating that tech companies and their trade associations will be ramping up their lobbying efforts at the state level, especially early next year, when many states resume their legislative calendars.

“For tech companies and for folks interested in digital rights, it’s going to be a wild ride in the next few years,” she said.


Thursday, October 14, 2021

From A Different Angle

It's easy to see how a growing affinity between Millennial youngsters and (e.g.) Boomer Progressives could look like a real threat to a "conservative" scheme aimed at installing a plutocracy.

So what's a good little fascist to do?

Divide-n-Conquer.

So how do I drive a wedge between two very important factions in order to keep them from joining forces and kicking my autocratic ass to the political curb?

The Young Turks take a crack at Cancel Culture as a kind of False Flag ploy - another example of how we can be manipulated by people who turn everything upside down and inside out.