Comments on “Because rationality matters”

Comments

The meta-rationalist sneer and objectivity

fot

I’m really baffled by your opening, and while maybe it’s just a matter of rhetoric I don’t like, I think there is some actual disagreement. In response to the meta-rationalist’s sneer, I think the rationalist should sneer back at the meta-rationalist for sometimes wanting not to believe in the truth. As you say later, the rationalist’s conception of truth is mistaken. So the problem is not believing in truth (one should want to believe only true things); it’s believing in a certain conception of truth.

“In what sense?” is a characteristically meta-rational question. “Yes, there is water: in the cells of the eggplant” is true in some sense—probably not a useful one. It’s false in another, more relevant sense.

I think this is the part where I disagree with you, on a point you may be making only implicitly. Not only is the claim that there is water false in some more relevant sense; that more relevant sense is the one and only sense belonging to the context of the question, and that context can be objectively determined, at least in actual real-world examples, or once you add sufficient detail to the hypothetical world.

If you have a friend over, you’ve been talking about some irrelevant non-scientific subject, and your friend asks whether you have water in your fridge, then answering “yes, in the cells of the eggplant” is an artificial attempt to move from the context of a visitor in your home asking about the contents of the appliance that stores your food and drink to an unmotivated, in-a-vacuum, faux-scientific one. The context can be objectively determined through our general social skills, shared habits, and shared enculturation. The fact that any reader sees there is some problem with the cells-of-the-eggplant answer is a demonstration of this.

What do “truth” and “belief” mean? These do sound like classic ivory-tower philosophical questions. Those are mostly nonsense and should be ignored. They are generally unanswerable; and having the “right” answers, if they existed, wouldn’t make any practical difference.

This again I think is mistaken, depending on how you interpret the question. If you interpret it as asking for a definition or complete theory of truth and belief, it is a general-intelligence-complete problem, which means it is essentially unanswerable. But we can do more than define: we can point, model, give case studies, gain practical skills, and find general properties. As an example, there is my statement above: you should want to believe only things that are true. Classical “ivory-tower” philosophy is an important part of this sort of investigation, and the arguments you make here about the nature of truth and belief and their meaning certainly fall within the scope of classical philosophy. It also seems in danger of the traditional STEM failure of dismissing the literature.

All in all, I think you make numerous statements in the article that contradict your closing point: “Recovering accurate, effective senses of “truth,” “belief,” and “rationality” requires a major re-thinking.”
