How to Actually Change Your Mind
First published: 2018
Average rating: 4.29
Pages: 310

“I believe that it is right and proper for me, as a human being, to have an interest in the future, and in what human civilization becomes in the future. One of those interests is the human pursuit of truth, which has strengthened slowly over the generations (for there was not always science). I wish to strengthen that pursuit further, in this generation. That is a wish of mine, for the Future. For we are all of us players upon that vast gameboard, whether we accept the responsibility or not.

“And that makes your rationality my business.

“Is this a dangerous idea? Yes, and not just pleasantly edgy ‘dangerous.’ People have been burned to death because some priest decided that they didn’t think the way they should. Deciding to burn people to death because they ‘don’t think properly’—that’s a revolting kind of reasoning, isn’t it? You wouldn’t want people to think that way, why, it’s disgusting. People who think like that, well, we’ll have to do something about them...

“I agree! Here’s my proposal: Let’s argue against bad ideas but not set their bearers on fire.”

Human intelligence is an amazing capacity that has single-handedly put humans in a dominant position on Earth. When human intelligence defeats itself and goes off the rails, the fallout therefore tends to be a uniquely big deal. In How to Actually Change Your Mind, decision theorist Eliezer Yudkowsky asks how we can better identify and sort out our biases, integrate new evidence, and achieve lucidity in our daily lives. Because it really seems as though we should be able to do better—and a three-pound all-purpose superweapon is a terrible thing to waste.
Ratings (Goodreads): 264 ratings, 4.29 average
5 stars: 48%
4 stars: 37%
3 stars: 13%
2 stars: 1%
1 star: 2%

Author

Eliezer Yudkowsky · 20 books

From Wikipedia: Eliezer Shlomo Yudkowsky is an American artificial intelligence researcher concerned with the singularity and an advocate of friendly artificial intelligence, living in Redwood City, California. Yudkowsky did not attend high school and is an autodidact with no formal education in artificial intelligence. He co-founded the nonprofit Singularity Institute for Artificial Intelligence (SIAI) in 2000 and continues to be employed as a full-time Research Fellow there. Yudkowsky's research focuses on Artificial Intelligence theory for self-understanding, self-modification, and recursive self-improvement (seed AI); and also on artificial-intelligence architectures and decision theories for stably benevolent motivational structures (Friendly AI, and Coherent Extrapolated Volition in particular). Apart from his research work, Yudkowsky has written explanations of various philosophical topics in non-academic language, particularly on rationality, such as "An Intuitive Explanation of Bayes' Theorem". Yudkowsky was, along with Robin Hanson, one of the principal contributors to the blog Overcoming Bias sponsored by the Future of Humanity Institute of Oxford University. In early 2009, he helped to found Less Wrong, a "community blog devoted to refining the art of human rationality". The Sequences on Less Wrong, comprising over two years of blog posts on epistemology, Artificial Intelligence, and metaethics, form the single largest bulk of Yudkowsky's writing.

© 2025 Paratext Inc. All rights reserved