Each time they fail, you expect some recognition, but the need to censor is strong with the disinformation factions.
When you step back for a moment, you get the impression that maybe you sound like a tinfoil chapeau nutter. It starts to sound like around every corner there is another person wanting to silence you, and you begin to resemble the holed-up hysteric with a conspiracy wall adorned with clippings and red string connecting all the players! Except for one detail: The tangible examples keep rolling out for us to actually see.
Recall that we saw the Biden administration create the Disinformation Governance Board, which met its justified demise. But they have made a second effort at this, without the same level of promotion, and now involving more federal agencies.
Currently, Missouri Attorney General Eric Schmitt is deep in a lawsuit filed against the Biden administration regarding the concerted efforts made to have social media platforms silence voices that they deemed to be dangerous to their pandemic response. So far, Schmitt has uncovered communications between numerous agencies and Facebook, Twitter, and Google/YouTube, involving nearly 100 government officials, committed to controlling the COVID messaging.
Schmitt has emails where Meta (Facebook's parent company) declared how it would step up silencing efforts. “Increasing the strength of our demotions for COVID and vaccine-related content that third party fact checkers rate as ‘Partly False’ or ‘Missing Context.’ That content will now be demoted at the same strength that we demote any content on our platform rated ‘False.'”
And this is not all. Recently Facebook CEO Mark Zuckerberg was on the Joe Rogan podcast and he let it slip how involved the FBI had been in the silencing of the Hunter Biden laptop story. Zuckerberg detailed how the FBI came to his company before the laptop story broke, informing them a story was coming and then guiding them to treat this as misinformation as it bore the earmarks of a Russian disinformation campaign.
“Basically, the background here is the FBI, I think, basically came to us- some folks on our team and was like, ‘Hey, just so you know, like, you should be on high alert… We thought that there was a lot of Russian propaganda in the 2016 election. We have it on notice that, basically, there’s about to be some kind of dump … that’s similar to that. So just be vigilant.'”
This cannot be overstated. The FBI not only warned Facebook ahead of the story coming out, but that agency guided them on how to treat it. The entirely false narrative of the Hunter laptop being a Russian disinformation campaign appears to have been concocted by the Feds. They were advising on how to treat this artificial narrative in the fall of 2020, before the story came out, all while the agency had been in possession of the very real laptop since December of 2019.
And if all of this is not enough to have you buying Reynolds Wrap in bulk, now we have a new effort these social media companies have been working on to control the discourse on their platforms. NBC News details how they have been experimenting with techniques that are right in line with the FBI advance team efforts. The goal is for them to be “pre-emptively debunking misinformation or conspiracy theories by telling people what to watch out for.”
The practice is called “pre-bunking,” where they hope to coach users on what they are consuming. This effort began in 2020, with Twitter placing notifications on its central feed regarding the election. “These prompts will alert people that they may encounter misinformation, and provide them with credible, factual information on the subject,” Twitter said. This was not combating what the company saw as inaccurate information; it was the establishment of what the approved narrative would be. They were announcing what they considered to be acceptable discourse.
This latest evolution, being undertaken by Google, looks into the prospect of conditioning users on how to approach what the company considers to be disinformation. In studies, they are literally bringing in users to coach and educate them on how to properly take in information on the platform. The effort is to “find new ways to rebuild media literacy.”
The reasons for this are laid out. The use of fact-checkers on social media outlets has not delivered the desired results, so some other way of getting a handle on the narrative was seen as a priority.
Other approaches such as traditional fact-checking have failed to make a dent in online misinformation. “Words like ‘fact-checking’ themselves are becoming politicized, and that’s a problem, so you need to find a way around that,” said Jon Roozenbeek, lead author of the study and a postdoctoral fellow at Cambridge University’s Social Decision-Making Lab.
And just in case you have not yet gotten the Orwellian aftertaste after taking in this messaging, how about this description of what their goals are in this new effort?
The researchers compared the effects to vaccination, “inoculating” people against the harmful effects of conspiracy theories, propaganda or other misinformation.
This has become downright dystopian. The failure on many levels to get proper compliance on the vaccination messaging over the past couple of years now sees them wanting to go the next step and “inoculate” us against the messaging that they consider to be improper and of which they disapprove.
You know, I might feel a lot less paranoid and conspiratorial if they did not keep coming out with rather ominous proposals sending me down those pathways. Now hand me a fresh roll of aluminum wrap…