American philosopher Mike Huemer published a book a few months ago called Progressive Myths. He was kind enough to send me a review copy. You can return the favor by having a look at his Substack (Fake Nous).
The table of contents gives you a good idea of what to expect:
As he says, his role as a philosopher is to tell the truth as he sees it. The book is not meant to be politically balanced; it is mainly about the wrong ideas that (American) progressives, liberals, and wokes hold, not about the wrong things that conservatives or reactionaries believe. In debunking style, each chapter covers some popular narrative or claim about some topic, e.g. a particular case of a police killing, climate catastrophism, the tax burden, or whatever. To avoid strawmen, each chapter begins with a neutral description of the claim and the major figures espousing it (e.g. high-ranking US politicians, prominent journalists), followed by a fairly detailed debunking. The last 4 chapters take a step back and reflect on how to avoid believing wrong things. Since I was already mostly familiar with the content of the first 21 chapters, this latter part was the most interesting to me. He provides the following useful heuristic for finding reliable individuals. In the language of Hrishikesh Joshi, these are the truth experts: people to trust in cases where you don't have the time or expertise to dig into some topic but nevertheless desire to have an opinion. He has a 9-step approach:
Another aid to acquiring true beliefs is to find public intellectuals who are reliable and who have already taken the time to research the topics you are interested in. For this purpose, the author’s general intelligence matters (stupid people tend to be unreliable); however, once you get above a certain level (which the great majority of public intellectuals in fact exceed), objectivity and fair-mindedness are much more important than raw intelligence. This is because a biased person with a high IQ can use their intelligence to rationalize what they want to believe, rather than using it as a tool to get to the truth. With that in mind, here are some signs of reliability:
1. Reliable thinkers tend to give non-circular arguments for their views, rather than simply assuming a controversial ideological viewpoint. That is, they will cite evidence that a neutral party could reasonably be expected to agree with and could see to be evidence for their view without having already accepted their ideology.
2. Reliable thinkers qualify claims. They will say that something is probably the case, or almost always true, rather than definitely always true. Authors who make too many absolute statements are likely to be oversimplifying and may not be thoughtful enough to notice exceptions. (But there are exceptions even to this; some things, such as mathematical truths, are really definitely always true.)
3. Reliable authors tend to acknowledge reasons pointing in different directions, particularly about controversial matters. If smart people disagree about some policy, then there usually are both costs and benefits to the policy. If an author does not acknowledge this, the author is probably not fair-minded. (Again, there are exceptions; some ideas really are just dumb. Use your judgment in each case.)
4. Relatedly, reliable authors tend to discuss objections to their arguments. Authors who never address objections either have never thought about objections (in which case their thought process is unreliable) or have thought of objections but decided not to mention them (in which case they may not be entirely forthcoming).
5. Reliable thinkers do not always agree with one of the standard political orientations. It is highly unlikely that either the Democrats or the Republicans are wrong about everything; probably each side is sometimes right and sometimes wrong. If someone believes one side is virtually always right and the other side virtually always wrong, that person is probably forming beliefs based on a tribal identity rather than objective examination of the issues.
6. Reliable thinkers are not overly emotional. Now, you might wonder: “Why shouldn’t I be emotional? Political issues are important! The future of the nation is at stake!” Well, perhaps so. Nevertheless, a person’s emotions can interfere with objective judgment. If someone is unable to control his emotions for purposes of public discourse, he probably also cannot control them for purposes of making objective judgments.
7. Reliable thinkers provide serious discussion of the evidence. They might cite academic studies, government reports, court documents, and so on. They do not simply opine based on armchair guesses about the empirical facts.
8. Reliable thinkers make sense. When you read their work, it will lead you through logical lines of thought. It will not simply assert things, or appeal to emotionally charged language, or give you vague impressions.
9. Reliable thinkers are clear. My general sense is that, if an author is very hard to follow, that is probably because either (i) the author himself is confused or (ii) the author is trying to persuade you by non-rational means, which is a red flag (being rationally persuaded generally requires a clear understanding).
I would add another useful heuristic: caveat emptor (buyer beware). If they are trying to sell you something, usually fake medicine/diets/therapy, they are probably not reliable about that matter (unless it's Substack subscriptions!). He follows his own prescriptions well. The legal cases he discusses are well documented with quotes from courts, forensic experts, and so on; the other chapters cite government and academic statistics; and the language and reasoning in the book are clear and easy to follow. Applying his own criteria leads us (well, me) to think Mike Huemer is a reliable, though fallible, guide to truth. He did avoid certain questions related to race differences, but not every book can go into every topic. I quite liked the book and I imagine the readers of this blog will too.
Of particular interest to me was the coverage of Climategate, the case of the hacked emails from climate scientists. The mainstream conservative takeaway from this was notably dumb, focusing on repeating a single quote that is not really damning ("trick" ... "to hide the decline"). The really damning parts are the emails where the scientists discuss how they play the academic game and further their politics behind the scenes:
In a third message, a peer reviewer asks another professor for help in finding reasons to reject a paper that he (the reviewer) disagrees with: “If published as is, this paper could really do some damage. It won’t be easy to dismiss out of hand as the math appears to be correct theoretically, but it suffers from the classic problem of pointing out theoretical deficiencies, without showing that their improved method is actually better in a practical sense.”
In another message, a scientist discussed the idea of organizing a boycott of the journal Climate Research because the journal had published some papers skeptical of mainstream views on global warming. Another told a journal that he would have nothing to do with them until they got rid of the editor who was accepting papers by climate skeptics.
In another message, the head of CRU, who was one of the authors of the next IPCC report, spoke of wanting to prevent two climate skeptics’ papers from being mentioned in the IPCC report: “I can’t see either of these papers being in the next IPCC report. Kevin and I will keep them out somehow—even if we have to redefine what the peer review literature is!” (Those papers ultimately did in fact get cited in the IPCC report.)283
I am familiar with these and recently covered some examples of citation bias imposed by reviewers' demands. The conclusion from this is that anonymous surveys of scientists are king, because public statements and published papers reflect not merely what scientists believe, but also what gets censored one way or another:
This may also explain why the apparent consensus derived from reviews of the literature is so much stronger than the consensus found in direct surveys of scientists’ opinions: The majority is using its power to minimize expressions of dissent.
He also offers a model of the origins of wokeness via the escalation of social justice:
So the left-wing professors after the 1960’s have kept the movement for “social justice” alive. Each generation of professors teaches the next generation the current theories of how America is horrifyingly oppressive. The new crop of academics then compete with each other to take the radical theorizing to greater extremes. Though academics like to style ourselves as bold innovators always open to new ideas, the truth is that our profession rewards timid people who swallow their field’s intellectual orthodoxy so completely that the only sort of “challenge” they can contemplate is to take the orthodoxy’s assumptions to an even greater extreme.
Perhaps the best example of this is the ever-present racisms, from slavery to Jim Crow to racist drug laws (chapter 10) to microaggressions, structural racism, implicit bias (chapter 8), stereotype threat (chapter 9), etc. These are all attempts to keep the racism crisis going as an explanation for race gaps in social status that didn't disappear as expected. The reason why is clear enough: if racism is removed as the explanatory model for race gaps, it means something else is causing them, something not about social justice, something one cannot fix with more regulations and training sessions, maybe even something natural; that is, it leads straight to Cofnas-style hereditarianism after giving up on Sowell-type cultural models.
Finally, there's the open debate heuristic:
Aside: If the would-be speakers who disagree with you have bad arguments, then you could try verbally rebutting their arguments, which should expose the foolishness of their position. But if you secretly suspect that they have good arguments, which your side would be unable to rebut, then silencing them is the only defense. This is why people who are wrong, especially people who on some level know they are wrong, are the ones who most often call for speech restrictions.
The truth usually comes off better in open debate. So if your position is correct, you should probably want there to be an open debate; if your position is wrong, you should probably avoid open debate. Therefore, if you see a controversy in which one side is trying to stifle debate while the other welcomes open debate, you can make an inference about who is most likely right.
This is kind of an add-on to the 9-step approach above.
Imagine there was an organization called the Committee for Open Debate on History. Would applying the heuristic quoted below lead one to infer that the opponents of the Committee for Open Debate on History are likely to be factually wrong about history?
> The truth usually comes off better in open debate. So if your position is correct, you should probably want there to be an open debate; if your position is wrong, you should probably avoid open debate. Therefore, if you see a controversy in which one side is trying to stifle debate while the other welcomes open debate, you can make an inference about who is most likely right.
I'm a fan. But I think Chauvin was innocent.