Slouching Towards Oblivion


Monday, January 22, 2024

🚨 Breaking 🚨


Researchers have established a direct link
between kids getting measles
and their parents being dumbass gullible rubes

Saturday, September 23, 2023

Because Of Course


The Anti-Truth Gang is vehemently opposed to efforts (being made by normal people) to show the world what a collection of thuggish assholes they are.


"In a word, they're scum - with my apologies to scum." --George Conley


Misinformation research is buckling under GOP legal attacks

An escalating campaign, led by Rep. Jim Jordan (R-Ohio) and other Republicans, has cast a pall over programs that study political disinformation and the quality of medical information online

Academics, universities and government agencies are overhauling or ending research programs designed to counter the spread of online misinformation amid a legal campaign from conservative politicians and activists who accuse them of colluding with tech companies to censor right-wing views.

The escalating campaign — led by Rep. Jim Jordan (R-Ohio) and other Republicans in Congress and state government — has cast a pall over programs that study not just political falsehoods but also the quality of medical information online.

Facing litigation, Stanford University officials are discussing how they can continue tracking election-related misinformation through the Election Integrity Partnership (EIP), a prominent consortium that flagged social media conspiracies about voting in 2020 and 2022, several participants told The Washington Post. The coalition of disinformation researchers may shrink and also may stop communicating with X and Facebook about their findings.

The National Institutes of Health froze a $150 million program intended to advance the communication of medical information, citing regulatory and legal threats. Physicians told The Post that they had planned to use the grants to fund projects on noncontroversial topics such as nutritional guidelines and not just politically charged issues such as vaccinations that have been the focus of the conservative allegations.

NIH officials sent a memo in July to some employees, warning them not to flag misleading social media posts to tech companies and to limit their communication with the public to answering medical questions.

“If the question relates in any way to misinformation or disinformation, please do not respond,” read the guidance email, sent in July after a Louisiana judge blocked many federal agencies from communicating with social media companies. NIH declined to comment on whether the guidance was lifted in light of a September appeals court ruling, which significantly narrowed the initial court order.

“In the name of protecting free speech, the scientific community is not allowed to speak,” said Dean Schillinger, a health communication scientist who planned to apply to the NIH program to collaborate with a Tagalog-language newspaper to share accurate health information with Filipinos. “Science is being halted in its tracks.”

Academics and government scientists say the campaign also is successfully throttling the years-long effort to study online falsehoods, which grew after Russian attempts to interfere in the 2016 election caught both social media sites and politicians unawares.

Interviews with more than two dozen professors, government officials, physicians, nonprofits and research funders, many of whom spoke on the condition of anonymity to discuss their internal deliberations freely, describe an escalating campaign emerging as online propaganda is rising.

Social media platforms have pulled back on moderating content even as evidence mounts that Russia and China have intensified covert influence campaigns; next week, the disinformation watchdog NewsGuard will release a study that found 12 major media accounts from Russia, China and Iran saw the number of likes and reposts on X nearly double after Elon Musk removed labels calling them government-affiliated. Advances in generative artificial intelligence have opened the door to potential widespread voter manipulation. Meanwhile, public health officials are grappling with medical misinformation as the United States heads into the fall and winter virus season.

Conservatives have long complained that social media platforms stifle their views, but the efforts to limit moderation have intensified in the past year.

The most high-profile effort, a lawsuit known as Missouri v. Biden, is now before the Supreme Court, where the Biden administration seeks to have the high court block a ruling from the U.S. Court of Appeals for the 5th Circuit that found the White House, FBI and top federal health officials likely violated the First Amendment by improperly influencing tech companies’ decisions to remove or suppress posts on the coronavirus and elections. That ruling was narrower than a district court’s finding that also barred government officials from working with academic groups, including the Stanford Internet Observatory. But the Biden Justice Department argues the injunction still contradicts certain First Amendment principles, including that the president is entitled to use his bully pulpit to persuade American companies “to act in ways that the President believes would advance the public interest.”

“The university is deeply concerned about ongoing efforts to chill freedom of inquiry and undermine legitimate and much needed academic research in the areas of misinformation — both at Stanford and across academia,” Stanford Assistant Vice President Dee Mostofi told The Post. “Stanford believes strongly in academic freedom and the right of the faculty to choose the research they wish to pursue. The Stanford Internet Observatory is continuing its critical research on the important problem of misinformation.”

Jordan has issued subpoenas and demands for researchers’ communications with the government and social media platforms as part of a larger congressional probe into the Biden administration’s alleged collusion with Big Tech.

“This effort is clearly intended to deter researchers from pursuing these studies and penalize them for their findings,” Jen Jones, the program director for the Center for Science and Democracy at the Union of Concerned Scientists, an environmental group that promotes scientific research, said in a statement.

Disinformation scholars, many of whom tracked both covid-19 and 2020 election-rigging conspiracies, also have faced an onslaught of public records requests and lawsuits from conservative sympathizers echoing Jordan’s probe. Billionaire Elon Musk’s X has sued a nonprofit advocacy group, the Center for Countering Digital Hate, accusing it of improperly accessing large amounts of data through someone else’s license — a practice that researchers say is common. Trump adviser Stephen Miller’s America First Legal Foundation is representing the founder of the conspiracy-spreading website, the Gateway Pundit, in a May lawsuit alleging researchers at Stanford, the University of Washington and other organizations conspired with the government to restrict speech. The case is ongoing.

Nadgey Louis-Charles, a spokeswoman for the House Judiciary Committee that Jordan chairs, said the Jordan-led investigation is focused on “the federal government’s involvement in speech censorship, and the investigation’s purpose is to inform legislative solutions for how to protect free speech.”

“The Committee sends letters only to entities with a connection to the federal government in the context of moderating speech online,” she said. “No entity receives a letter from the Committee without a written explanation of the entity’s connection to the federal government.”

Missouri Attorney General Andrew Bailey (R) in a statement said the federal government “silenced” information because “it didn’t fit their narrative.”

“Missouri v. Biden is the most important First Amendment case in a generation, which is why we’re taking it to the nation’s highest court,” he said.


‘A serious threat to the integrity of science’

In September 2022, an NIH council greenlit a $150 million program to fund research on how to best communicate health issues to the public. Administrators had planned the initiative for months, convening a strategy workshop with top tech and advertising executives, academics, faith leaders and physicians.

“We know there’s a lot of inaccurate health information out there,” said Bill Klein, the associate director of the National Cancer Institute’s Behavioral Research Program, at a meeting approving the program. He showed a slide of headlines about how online misinformation hampered the response to the covid-19 pandemic, as well as other public health issues, including gun violence and HIV treatment.

The program was intended to address topics vulnerable to online rumors, including nutrition, tobacco, mental health and cancer screenings such as mammograms, according to three people who attended a planning workshop.

Yet in early summer 2023, NIH officials contacted some researchers with the news that the grant program had been canceled. NIH appended a cryptic notice to its website in June, saying the program was on “pause” so that the agency could “reconsider its scope and aims” amid a heated regulatory environment.

Schillinger and Richard Baron, the CEO of the American Board of Internal Medicine, warned in a July article in JAMA that the decision posed "a serious threat to the integrity of science and to its successful translation." In an interview with The Post, Baron noted that there are limited sources of funding for health misinformation research.


NIH declined requests for an interview about the decision to halt the program, but spokesperson Renate Myles confirmed in an email that the Missouri v. Biden lawsuit played into the decision. Myles said a number of other lawsuits played a role but declined to name them.

Myles said the litigation was just one factor and that budgetary projections and consideration of ongoing work also contributed to the decision. She said that an initial approval of a concept does not guarantee it will be funded and that NIH currently funds health communication research. The agency does not officially release numbers about funding in the area, but she said a working group estimated that NIH spent $760 million over five years.

“NIH recognizes the critical importance of health communications science in building trust in public health information and continues to fund this important area of research,” she said.

NIH and other public health agencies have also sought to limit their employees’ communications with social media platforms amid the litigation, according to internal agency emails viewed by The Post that were sent in July after a Louisiana judge blocked many federal agencies from communicating with social media companies.

In one instance, an NIH communications official told some employees not to flag misleading social media posts to tech companies — even if they impersonated government health officials or encouraged self-harm, according to a July email viewed by The Post. The employees were told they could not respond to questions about a disease area or clinical trial if it did “relate in any way to misinformation or disinformation.”

The Election Integrity Partnership may also curtail its scope following lawsuits questioning the validity of its work, including the Missouri v. Biden case.

Led by the Stanford Internet Observatory and the University of Washington’s Center for an Informed Public, the coalition of researchers was formed in the middle of the 2020 presidential campaign to alert tech companies in real time about viral election-related conspiracies on their platforms. The posts, for example, falsely claimed Dominion Voting Systems’ software switched votes in favor of President Biden, an allegation that also was at the center of a defamation case that Fox News settled for $787 million.

In March 2021, the group released a nearly 300-page report documenting how false election fraud claims rippled across the internet, coalescing into the #StopTheSteal movement that fomented the Jan. 6 attack at the U.S. Capitol. In its final report, the coalition noted that Meta, X (formerly Twitter), TikTok and YouTube labeled, removed or suppressed just over a third of the posts the researchers flagged.

But by 2022, the partnership was engulfed in controversy. Right-wing media outlets, advocacy groups and influencers such as the Foundation for Freedom Online, Just the News and far-right provocateur Jack Posobiec argued that the Election Integrity Partnership was part of a coalition with government and industry working to censor Americans’ speech online. Jordan has sent several legal demands to see the coalition’s internal communications with the government and social media platforms and hauled them into Congress to testify about their work.

Louis-Charles, the Judiciary Committee spokeswoman, said in a statement that the universities involved with EIP “played a unique role in the censorship industrial complex given their extensive, direct contacts with federal government agencies.”


The probe prompted members of the Election Integrity Partnership to reevaluate their participation in the coalition altogether. Stanford Internet Observatory founder Alex Stamos, whose group helps lead the coalition, told Jordan’s staff earlier this year that he would have to talk with Stanford’s leadership about the university’s continued involvement, according to a partial transcript filed in court.

“Since this investigation has cost the university now approaching seven [figure] legal fees, it’s been pretty successful I think in discouraging us from making it worthwhile for us to do a study in 2024,” Stamos said.

Kate Starbird, co-founder of the University of Washington Center for an Informed Public, declined to elaborate on specific plans to monitor the upcoming presidential race but said her group aims to put together a “similar coalition … to rapidly address harmful false rumors about the 2024 election.”

Another participant in the Election Integrity Partnership, who spoke on the condition of anonymity, said the group was “looking at ways to do our work completely in the open” to avoid allegations that direct communications with the platforms are a part of a censorship apparatus.

The researchers have been encouraged by the recent ruling in the Court of Appeals for the 5th Circuit in the Missouri v. Biden litigation, which struck down a July 4 injunction that barred government officials from collaborating or coordinating with the Election Integrity Project, the Stanford Internet Observatory and other similar groups.

‘Naughty & Nice List’

In recent weeks, Jordan has sent a new round of record requests to at least two recipients of grants from the National Science Foundation’s Convergence Accelerator program, according to three people familiar with the matter.

The program, one of many run by the independent agency to promote research, awards funding to groups creating tools or techniques to mitigate misinformation, such as software for journalists to identify misinformation trending online.

George Washington University professor Jonathan Turley and the conservative advocacy group The Foundation for Freedom Online wrote separate reports portraying the program as an effort by the Biden administration to censor or blacklist American citizens online. Afterward, Jordan requested grant recipients’ communications with the White House, technology companies and government agencies, according to two of the people.

Turley said in a statement that “free speech is a core value of higher education” and that he is concerned that universities are using partnerships with the government to silence some users.

“If universities are supporting efforts to regulate or censor speech, there should be both clarity and transparency on this relationship. In past years, academics have demanded such transparency in other areas of partnership with the government, including military research,” Turley said. “Free speech values should be of equal concern to every institution of higher learning.”

Some NSF grant recipients who have not received requests from Jordan’s committee say they are facing a barrage of online threats over their work, which has prompted some to buy services that make it harder to find their addresses, such as DeleteMe.

Hacks/Hackers, a nonprofit coalition of journalists and technologists, received an NSF grant to develop tools to help people share accurate information about controversial topics, such as vaccine efficacy. The group has faced political scrutiny from Sen. Joni Ernst (R-Iowa), who tweeted they had received $5 million from President Biden to create “a naughty & nice list to police the content posted by family & friends” with her usual slogan “MakeEmSqueal.”


Connie Moon Sehat, a researcher-at-large for the group, said she and other researchers have faced online attacks, including threats to reveal personal information and veiled death threats. She said members of her team are at times under high levels of stress and are having ongoing conversations about how to elevate accurate information on social media as some platforms become increasingly toxic.

“We are double- and triple-checking what we write, above what we used to, to try to communicate our good intentions — in the face of efforts that willfully misconstrue our work and desire to serve the public,” Sehat said. “And I worry more broadly that we researchers may self-censor our inquiry, or that some will drop out altogether, to stay safe.”

As Jordan’s probe expands, some university lawyers have urged academics to hold on to their records and be prepared to receive subpoenas from the committee, according to two people familiar with the matter.

The probe has sparked a wave of fear among university academics, prompting several to take a lower profile to avoid the scrutiny. Laura Edelson, an assistant professor of computer science at Northeastern University, recently left her role as chief technologist at the Justice Department’s antitrust division. She said she tailored her job search to only private universities that are not subject to public records laws.

“I knew that because of the way our field is being attacked that the cost of the work I do is a lot higher at a public institution,” she said. “I just didn’t want to pay that cost, and that’s why I only applied to private universities.”

The left-leaning nonprofit Center for Democracy and Technology argued in a Thursday report that the disinformation field is facing a dual threat: Social media platforms have become less responsive to concerns from researchers about misinformation while the political and regulatory backlash against the scholarship has eroded the relationships between academics, nonprofits and industry.

“The more efforts to recast counter-election-disinformation as censorship succeed, the more difficult it will become for governments and others to work with researchers in the field,” wrote the nonprofit, which receives some of its funding from tech corporations, including Google and Meta.

The scrutiny has caught the academic community by surprise, as non-faculty staff and researchers debate how to protect themselves from new legal threats. When Dannagal Young, a professor of communication and political science at the University of Delaware, alerted university lawyers that she’d been asked to talk with Democratic congressional staffers about potentially testifying before Jordan’s subcommittee, she felt her preparation was lacking.

While the lawyers were eager to help, according to Young, they spent more time prepping her on how to discuss President Biden’s relationship to the school than they did on what kinds of questions she might be asked on Capitol Hill.

“I don’t think university lawyers are prepared to navigate that kind of politically motivated space,” she said. The University of Delaware didn’t respond to a request for comment.

Many academics, independent scholars and philanthropic funders are discussing how to collectively defend the disinformation research field. One proposal would create a group to gather donations into a central fund to pay for crisis communications and — most critically — legal support if one of them gets sued or subpoenaed in a private case or by Congress. The money could also fund cybersecurity counseling to ward off hackers and stalkers and perhaps physical security as well.

“There is this growing sense that there need to be resources to allow for freedom of thought and academic independence,” said one longtime philanthropy grant maker who spoke on the condition of anonymity to discuss internal matters.

University academics are also mulling ways to rebrand their work to attract less controversy. One leader of a university disinformation research center said scholars have discussed using more generic terms to describe their work, such as “information integrity” or “civic participation online.” Those terms “have less of a bite to them,” said the person, who spoke on the condition of anonymity to describe the private discussions. Similar conversations are occurring within public health agencies, another person said.

“This whole area of research has become radioactive,” the person said.

Tuesday, April 04, 2023

On Being A Trouper

Heather McDonald was on stage talking about COVID shots and other vaccination stuff when she passed out and immediately became a told-ya-so bugbear for the anti-vax idiots.




Heather McDonald’s on-stage collapse became anti-vaccine fodder, but she’s alive and joking

McDonald, like a half-dozen other people whose medical events are shown in the trailer for the anti-vaccine film "Died Suddenly," did not die as a result of the Covid-19 vaccine.

Three minutes into the trailer for the widely debunked anti-vaccine film “Died Suddenly,” comedian Heather McDonald is shown collapsing on stage.

In the background, a voiceover from people identified as “whistleblowers” lays out the film’s mission statement.

“It’s the new bullet. It’s the new form of warfare,” the voice of a man in a darkened room says about the Covid vaccine.

“The dead can’t speak for themselves, so therefore, I have to speak for them,” says another.

The idea that she can’t speak for herself comes as a surprise to McDonald, who recently sat in her studio in Woodland Hills, California, prepping for her weekly podcast. She has published it every week since she passed out on stage in February 2022 at the Tempe Improv in Arizona.

Since then, videos of her collapse have been viewed millions of times on social media. Joe Rogan talked about it on his podcast. Fox News published an article about her collapse and tweeted: “Comedian collapses on stage, fractures skull after declaring she’s triple vaxxed.”

McDonald, 52, said she’s getting used to being recognized as a piece of propaganda.

“Sometimes there’ll be people who say, ‘Oh my God, I just saw you on something,’” McDonald said in an interview from her bright pink podcast studio. “And I’m like, ‘Sadly, I know what it is. It’s me fainting.’”

McDonald, like a half-dozen other people whose medical events are shown in the trailer for the anti-vaccine conspiracy theory film, did not die as a result of the Covid-19 vaccine. Many of them now live their lives with a strange internet notoriety, the kind that didn’t exist even just a few years ago.

The film has since been widely debunked, including by Reuters and FactCheck.org. Its issues even sparked concern from people within the anti-vaccine movement who worried it made them look bad. The person featured immediately after McDonald in the trailer, Keyontae Johnson, collapsed on Dec. 12, 2020, days before Covid vaccines were available or widely administered in the U.S. This month, Johnson made it to the Elite 8 of the NCAA Men’s Basketball Tournament with the rest of the Kansas State Wildcats and is a projected NBA Draft pick. (Johnson was eventually diagnosed with a heart condition unrelated to the Covid vaccine.)

That has done little to stymie the success of “Died Suddenly,” which has evolved from the title of a film into something of a rallying cry in the anti-vaccine world. The hashtag “#DiedSuddenly” trended on Twitter after the film’s release and has become a consistent internet trope, reappearing when high-profile medical events occur on television or in public.

Recounting the night she collapsed, McDonald said she felt dizzy at the start of her set and, had she not been on stage in front of friends and family for the first time since the beginning of the pandemic, would have simply sat back down instead of trying to fight through it. Afterward, McDonald received a battery of tests from doctors, who said that she had no underlying condition and that the fainting spell was not linked to the vaccine. McDonald suffered a fractured skull and a concussion, and still checks in with doctors but has had no lingering issues.

Video of her collapse quickly spread online. Hoping to clear up any confusion, McDonald released the video along with an update from doctors to her own social media accounts days after her collapse. While she joked at first that she didn’t mind the attention, releasing the video had the opposite effect, and she became overwhelmed by conspiracy theories about her health.

McDonald said she was quickly able to clarify her health status with her own listeners, and wondered what it’s like for people “who can’t go on their podcast and say ‘I’m fine.’”

Even with McDonald’s platform, the video of her collapse continues to circulate online. Videos tagged with “Heather McDonald collapse” have over 17 million views on TikTok, frequently outranking in search results the content she posts to her own 370,000 followers.

“That’s what really made me sad: I thought I was a little bit better known than that,” joked McDonald.

McDonald has had problems counteracting the viral misinformation about her even among people she knows.

Joe Rogan, whom she knew from backstage conversations at comedy sets in Los Angeles, played the video of McDonald’s collapse on his podcast and alluded to links to the vaccine.

“I DM’d him and I’m like, ‘Joe, do you not know who I am?’” McDonald said.

McDonald said Rogan did not respond to her direct message. Rogan and Spotify, his exclusive podcast distributor, did not respond to emailed requests for comment.

McDonald, who is vaccinated, said she has gone to great lengths to stay apolitical, not mentioning politics on her podcast — which focuses on reality TV gossip — since the 2016 election.

“I’m nonpolitical, and I get thrown into this thing just for doing my job and working,” she said.

Despite immediate and repeated debunking, the film has had a lasting impact in the anti-vaccine community even as the film’s producer, Stew Peters, has had to come up with increasingly bizarre ways to explain its inaccuracies.

Peters pushed the conspiracy theory that Buffalo Bills player Damar Hamlin, who collapsed on Jan. 2 after being hit in the chest on “Monday Night Football,” was dead or being hidden away as part of a global conspiracy to protect vaccine makers.

When Hamlin reappeared and gave public interviews, Peters repeatedly insisted that Hamlin had been replaced by a “body double.”

As the documentary’s main thesis — that the vaccine is causing mass death to young people — fails to bear out statistically, Peters has pivoted to new conspiracy theories about other maladies he attributes to the vaccine.

Earlier this month, Peters tweeted that men who received the mRNA vaccine “are essentially infertile and their penises are rotting off.”

Peters did not respond to a request for comment.

Still, Peters and the “Died Suddenly” crew maintain an audience with lawmakers. Rep. Paul Gosar, R-Ariz., appeared on Peters’ video podcast on March 14, above a scroll imploring viewers to watch “Died Suddenly.”

Interest in the film was renewed in the last month when an Idaho state senator and a representative introduced a bill that would make the administration of mRNA vaccines (the type used against Covid-19) a misdemeanor in the state.

One of the bill’s co-authors, state Sen. Tammy Nichols, has repeatedly implored her constituents and followers on Facebook to watch “Died Suddenly.”

“Everyone is talking about Died Suddenly on Rumble. Powerful!” she wrote on Nov. 22. “Watch Died Suddenly and stand up to this garbage,” she added the next day. Nichols, who did not respond to a request for comment, used the hashtag #DiedSuddenly as recently as Feb. 19.

Dr. Eric Burnett, an internal medicine doctor at the Columbia University Irving Medical Center, has recorded several videos on TikTok attempting to combat the lies in “Died Suddenly.”

Burnett said that he now sees people conflating disinformation about the potential harm from vaccines with the actual threat of Covid-related illnesses.

“Anti-vaxxers and these myth-spreaders, they operate in this bubble that doesn’t require evidence, that doesn’t require any burden of proof. They could say whatever they want, and it’s consequence-free for them,” Burnett said.

Despite repeated debunking of “Died Suddenly,” the lies about the videos featured in it won’t die — and even morph into new ones.

McDonald said the release of “Died Suddenly” coincided with a second wave of attention and abuse. People began posting that McDonald, who is a practicing Catholic, was smited and “flicked” by God for the joke she tried to deliver right before her collapse, in which she said Jesus loved her because she hadn’t yet gotten Covid-19.

“They say something mean, like, ‘You shouldn’t be alive because you got the vaccine.’ Or they’ll say, ‘You shouldn’t be alive because you joked about Jesus,’” McDonald said.

Despite millions of views across social media platforms, McDonald said she was simply left with people questioning her faith, part of an ever-evolving conspiracy theory in which it’s unclear if she’s even alive.

“I am in this business. I’d like to be known,” she said. “But this was just not any bonus for me at all.”