I’m reading the LessWrong Sequences from the beginning. Most of it is a repeat for me, but there may be some sections I skipped originally, and of course plenty I’ve forgotten.
The Sequences provide stepping stones of logic to help people internalize non-trivial concepts about rationality and related topics.
I’m going to leave notes on the main ideas I take away. Mostly these are a reminder to me, but hopefully they encourage other people to read along.
The first sequence is Mysterious Answers to Mysterious Questions.
- A belief is only worth holding if it affects the distribution of anticipated experience. If following a chain of beliefs never leads to an anticipated experience, or leads to a prediction that reality contradicts with sufficient evidence, the belief(s) should be deleted.
- No two rationalists can agree to disagree. Agreeing to disagree is a symptom of disagreeing about the facts, so check your definitions.
- Focus on narrow, small steps of inference. It may seem wise and impressive to generalize, but sweeping claims are much harder to prove, and logical errors are harder to spot while making them.
- “A hypothesis that forbids nothing permits everything, and thereby fails to constrain anticipation. Your strength as a rationalist is your ability to be more confused by fiction than by reality. If you are equally good at explaining any outcome, you have zero knowledge.” – Yudkowsky
- Absence of evidence is evidence of absence. A state of the world becomes less likely when the evidence we would expect under it fails to appear (see the numeric sketch after this list).
- The expectation of the posterior probability (the right-hand side below), taken over the possible observations before viewing the evidence, must equal the prior probability (the left-hand side). Evidence cannot be one-sided: it cannot only confirm or only deny.
- P(H) = P(H|E)*P(E) + P(H|~E)*P(~E). This is important (and is checked numerically in the sketch after this list):
If you expect a strong probability of seeing weak evidence in one direction, it must be balanced by a weak expectation of seeing strong evidence in the other direction. If you’re very confident in your theory, and therefore anticipate seeing an outcome that matches your hypothesis, this can only provide a very small increment to your belief (your anticipated probability was already close to 1); but the unexpected failure of your prediction would (and must) deal your confidence a huge blow. On average, you must expect to be exactly as confident as when you started out. Equivalently, the mere expectation of encountering evidence (before you’ve actually seen it) should not shift your prior beliefs. – Yudkowsky
- Hindsight bias is hard to deal with. We need to imagine the world before the evidence came in to truly appreciate the value of the science.
- An explanation is not an explanation if it does not constrain the probability space. Using scientific words adds no value if their meaning is not understood. Do not merely accept the name of a scientific phenomenon or theory as an explanation; work through its implications to see whether the evidence is truly explained by that specific theory or system (and not by just anything, “magic”).
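To make the absence-of-evidence and conservation-of-expected-evidence bullets concrete, here is a minimal sketch in Python. The prior and likelihoods are made-up numbers for illustration, not anything from the Sequences; only the Bayes arithmetic matters.

```python
# Minimal sketch with made-up numbers. H = "the hypothesis is true",
# E = "we observe the evidence the hypothesis predicts".
p_h = 0.7              # prior belief in H (illustrative assumption)
p_e_given_h = 0.9      # H strongly predicts the evidence
p_e_given_not_h = 0.3  # the evidence is less likely if H is false

# Total probability of seeing the evidence.
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)  # 0.72

# Posteriors via Bayes' theorem.
p_h_given_e = p_e_given_h * p_h / p_e                      # 0.875
p_h_given_not_e = (1 - p_e_given_h) * p_h / (1 - p_e)      # 0.25

# Absence of evidence is evidence of absence: seeing E nudges
# belief up a little; failing to see E deals it a much larger blow.
print(f"P(H)    = {p_h:.3f}")              # 0.700
print(f"P(H|E)  = {p_h_given_e:.3f}")      # 0.875 (small confirmation)
print(f"P(H|~E) = {p_h_given_not_e:.3f}")  # 0.250 (large blow)

# Conservation of expected evidence: the prior equals the expected
# posterior, i.e. P(H) = P(H|E)*P(E) + P(H|~E)*P(~E).
expected_posterior = p_h_given_e * p_e + p_h_given_not_e * (1 - p_e)
assert abs(expected_posterior - p_h) < 1e-12
```

With these numbers the trade-off in the quoted passage is exact: a 0.72 chance of a small +0.175 update is balanced by a 0.28 chance of a large −0.45 update, since 0.72 × 0.175 = 0.28 × 0.45 = 0.126.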