> Science is about creating *simple, causal explanations* of the world so that we can make good predictions. But science has had a hard time in certain parts of the world where simple, causal explanations seem to elude us: psychology, social sciences, economics, and more. Despite centuries of research and little consistent progress, we keep looking for scientific explanations in these areas because we have no good alternative.
Science seeks simple, linear explanations of a [[The world is a complex system of complex systems|complex]] world.
> AI changes this equation. It allows us to make predictions about parts of the world that we can’t yet explain with science. AI can screen for cancer now. It can also predict things like who is likely to be diagnosed with anxiety or depression, or, in theory, which interventions or practitioners are most likely to heal it. It can do this without needing any definitive, universal scientific explanation for what anxiety is and how it arises—which decades of research have failed to find.
> Nature doesn’t guarantee that simple explanations exist—even though we, as humans, are by nature attracted to them. This could represent the full-circle journey of science. At first, it tossed out human intuition and human stories in favor of equations and experiments. In the end, it might be that intuition and storytelling were the best ways for our minds to predict and explain things that are beyond the ability of our more limited rational minds to comprehend. That would be something, indeed.
By learning from data, AI can understand and predict without needing explanations expressed in language.