When the whole world is calling you a crook,
one thing you have to stop and consider is that
maybe you're a fucking crook.
What the boss says goes.
Rule 1: The boss is always right
Rule 2: If the boss is ever wrong, refer to rule 1
1. The boss can be wrong, and the employees can be right. So it's best if everybody gets the benefit of the doubt, and we can hash it all out as we go.
2. Although this is a business and not a democracy, it's a business that exists within a democracy, and democracy is not a business. We all have rights that are not relinquished in exchange for a paycheck.
3. Earn cookies, get cookies. Earn shit, get shit. And that goes for bosses and employees alike.
That's it - let's get back to work.
While concerns about political and media polarization online are longstanding, our study suggests that polarization was asymmetric. Pro-Clinton audiences were highly attentive to traditional media outlets, which continued to be the most prominent outlets across the public sphere, alongside more left-oriented online sites. But pro-Trump audiences paid the majority of their attention to polarized outlets that have developed recently, many of them only since the 2008 election season.
Attacks on the integrity and professionalism of opposing media were also a central theme of right-wing media. Rather than “fake news” in the sense of wholly fabricated falsities, many of the most-shared stories can more accurately be understood as disinformation: the purposeful construction of true or partly true bits of information into a message that is, at its core, misleading. Over the course of the election, this turned the right-wing media system into an internally coherent, relatively insulated knowledge community, reinforcing the shared worldview of readers and shielding them from journalism that challenged it. The prevalence of such material has created an environment in which the President can tell supporters about events in Sweden that never happened, or a presidential advisor can reference a non-existent “Bowling Green massacre.”
Have you ever noticed that when you present people with facts that are contrary to their deepest held beliefs they always change their minds? Me neither. In fact, people seem to double down on their beliefs in the teeth of overwhelming evidence against them. The reason is related to the worldview perceived to be under threat by the conflicting data.
Creationists, for example, dispute the evidence for evolution in fossils and DNA because they are concerned about secular forces encroaching on religious faith. Anti-vaxxers distrust big pharma and think that money corrupts medicine, which leads them to believe that vaccines cause autism despite the inconvenient truth that the one and only study claiming such a link was retracted and its lead author accused of fraud. The 9/11 truthers focus on minutiae like the melting point of steel in the World Trade Center buildings that caused their collapse because they think the government lies and conducts “false flag” operations to create a New World Order. Climate deniers study tree rings, ice cores and the ppm of greenhouse gases because they are passionate about freedom, especially that of markets and industries to operate unencumbered by restrictive government regulations. Obama birthers desperately dissected the president's long-form birth certificate in search of fraud because they believe that the nation's first African-American president is a socialist bent on destroying the country.
In these examples, proponents' deepest held worldviews were perceived to be threatened by skeptics, making facts the enemy to be slayed. This power of belief over evidence is the result of two factors: cognitive dissonance and the backfire effect. In the classic 1956 book When Prophecy Fails, psychologist Leon Festinger and his co-authors described what happened to a UFO cult when the mother ship failed to arrive at the appointed time. Instead of admitting error, “members of the group sought frantically to convince the world of their beliefs,” and they made “a series of desperate attempts to erase their rankling dissonance by making prediction after prediction in the hope that one would come true.” Festinger called this cognitive dissonance, or the uncomfortable tension that comes from holding two conflicting thoughts simultaneously.
Two social psychologists, Carol Tavris and Elliot Aronson (a former student of Festinger), in their 2007 book Mistakes Were Made (But Not by Me) document thousands of experiments demonstrating how people spin-doctor facts to fit preconceived beliefs to reduce dissonance. Their metaphor of the “pyramid of choice” places two individuals side by side at the apex of the pyramid and shows how quickly they diverge and end up at the bottom opposite corners of the base as they each stake out a position to defend.
In a series of experiments by Dartmouth College professor Brendan Nyhan and University of Exeter professor Jason Reifler, the researchers identify a related factor they call the backfire effect “in which corrections actually increase misperceptions among the group in question.” Why? “Because it threatens their worldview or self-concept.” For example, subjects were given fake newspaper articles that confirmed widespread misconceptions, such as that there were weapons of mass destruction in Iraq. When subjects were then given a corrective article that WMD were never found, liberals who opposed the war accepted the new article and rejected the old, whereas conservatives who supported the war did the opposite ... and more: they reported being even more convinced there were WMD after the correction, arguing that this only proved that Saddam Hussein hid or destroyed them. In fact, Nyhan and Reifler note, among many conservatives “the belief that Iraq possessed WMD immediately before the U.S. invasion persisted long after the Bush administration itself concluded otherwise.”
If corrective facts only make matters worse, what can we do to convince people of the error of their beliefs? From my experience: 1. keep emotions out of the exchange; 2. discuss, don't attack (no ad hominem and no ad Hitlerum); 3. listen carefully and try to articulate the other position accurately; 4. show respect; 5. acknowledge that you understand why someone might hold that opinion; and 6. try to show how changing facts does not necessarily mean changing worldviews. These strategies may not always work to change people's minds, but now that the nation has just been put through a political fact-check wringer, they may help reduce unnecessary divisiveness.
Alice, Bob, and Chris are taken to the police station for questioning. Detectives know that one of them is the thief and that only one of them tells the truth.
Alice says, "I am not the thief."
Bob says, "Alice is the thief."
Chris says, "I am not the thief."
Who is the thief?

And there's all kinds of good reasoning going on as people weigh in and suss it all out. I'm OK at such things, but I'm really not very adept at the game theory thing (or whatever I'd have to be good at to get through this kinda stuff), so I just have to take an awful lot of people's word for things, y'know?
If Alice is the thief -
1. Alice's claim "I am not the thief" is a lie... we have our first liar.
2. Bob's claim "Alice is the thief" is the truth... we have our one truth-teller.
3. That leaves Chris as the required second liar, so his claim "I am not the thief" would be a lie, which would make Chris the thief too... but we started by assuming Alice was the thief, and there is only one thief. Since we can't have two thieves, we discard this case.
If Bob is the thief -
1. Alice's claim "I am not the thief" is the truth... we have our one truth-teller.
2. Bob's claim "Alice is the thief" is a lie, since we're assuming Bob is the thief... we have our first liar.
3. Chris's claim "I am not the thief" is also the truth... that makes two truth-tellers, and we know only one person tells the truth. So Bob is NOT the thief.
If Chris is the thief -
1. Alice's claim "I am not the thief" is the truth... we have our one truth-teller.
2. Bob's claim "Alice is the thief" is a lie, since Alice is NOT the thief... we have our first liar.
3. Chris's claim "I am not the thief" is a lie, since he is indeed the thief... we have our second liar.
So, if Chris is the thief, we have exactly two liars and one truth-teller, as the problem requires.
Therefore, Chris is the thief.
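If you'd rather not take anyone's word for it, the whole case analysis above can be checked by brute force. Here's a minimal sketch in Python (the names and structure are mine, not from the original puzzle thread): it tries each suspect in turn and keeps only the assumption under which exactly one of the three statements comes out true.

```python
# Brute-force check: one of Alice, Bob, Chris is the thief,
# and exactly one of their three statements is true.

SUSPECTS = ["Alice", "Bob", "Chris"]

def statements(thief):
    """Truth value of each statement, assuming `thief` did it."""
    return [
        thief != "Alice",   # Alice: "I am not the thief."
        thief == "Alice",   # Bob:   "Alice is the thief."
        thief != "Chris",   # Chris: "I am not the thief."
    ]

for thief in SUSPECTS:
    if sum(statements(thief)) == 1:   # exactly one truth-teller
        print(f"Consistent assignment: {thief} is the thief")
```

Running this prints only "Consistent assignment: Chris is the thief", matching the hand-worked cases: assuming Alice yields two truth-tellers (Bob and Chris), assuming Bob also yields two (Alice and Chris), and only assuming Chris leaves exactly one.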
A cognitive bias refers to a systematic pattern of deviation from norm or rationality in judgment, whereby inferences about other people and situations may be drawn in an illogical fashion.[1] Individuals create their own "subjective social reality" from their perception of the input.[2] An individual's construction of social reality, not the objective input, may dictate their behaviour in the social world.[3] Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, or what is broadly called irrationality.[4][5][6]
Some cognitive biases are presumably adaptive. Cognitive biases may lead to more effective actions in a given context.[7] Furthermore, cognitive biases enable faster decisions when timeliness is more valuable than accuracy, as illustrated in heuristics.[8] Other cognitive biases are a "by-product" of human processing limitations,[9] resulting from a lack of appropriate mental mechanisms (bounded rationality), or simply from a limited capacity for information processing.[10]
A continually evolving list of cognitive biases has been identified over the last six decades of research on human judgment and decision-making in cognitive science, social psychology, and behavioral economics. Kahneman and Tversky (1996) argue that cognitive biases have efficient practical implications for areas including clinical judgment.[11]

Some examples: