#7 - Philosophy of Probability II: Existential Risks

July 7th, 2020

1 hr 37 mins 32 secs

About this Episode

Back down to earth we go! Or try to, at least. In this episode Ben and Vaden attempt to ground their previous discussion on the philosophy of probability by focusing on a real-world example, namely the book The Precipice by Toby Ord, recently featured on the Making Sense podcast. Vaden believes in arguments, and Ben argues for beliefs.

Quotes
"A common approach to estimating the chance of an unprecedented event with earth-shaking consequences is to take a skeptical stance: to start with an extremely small probability and only raise it from there when a large amount of hard evidence is presented. But I disagree. Instead, I think the right method is to start with a probability that reflects our overall impressions, then adjust this in light of the scientific evidence. When there is a lot of evidence, these approaches converge. But when there isn’t, the starting point can matter.

In the case of artificial intelligence, everyone agrees the evidence and arguments are far from watertight, but the question is where does this leave us? Very roughly, my approach is to start with the overall view of the expert community that there is something like a one in two chance that AI agents capable of outperforming humans in almost every task will be developed in the coming century. And conditional on that happening, we shouldn’t be shocked if these agents that outperform us across the board were to inherit our future. Especially if when looking into the details, we see great challenges in aligning these agents with our values."
- The Precipice, p. 165
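
A minimal sketch of the convergence point Ord makes in the quote above. It is not from the book: the Beta-Binomial setup and the specific priors are illustrative assumptions. The idea is that with scant evidence the starting prior dominates the answer, while with plenty of evidence very different priors end up in roughly the same place.

```python
# Illustrative only: with conjugate Beta-Binomial updating, the posterior
# mean of a Beta(a, b) prior after `successes` out of `trials` observations
# is (a + successes) / (a + b + trials).

def posterior_mean(a, b, successes, trials):
    """Posterior mean of a Beta(a, b) prior updated on Binomial data."""
    return (a + successes) / (a + b + trials)

# Two starting points: a skeptic whose prior sits near 1-in-1000, and an
# observer whose prior reflects an overall impression of roughly 1-in-2.
skeptic = (1, 999)      # Beta(1, 999): prior mean ~ 0.001
impression = (1, 1)     # Beta(1, 1): uniform prior, mean 0.5

for successes, trials in [(1, 2), (50_000, 100_000)]:
    p_skeptic = posterior_mean(*skeptic, successes, trials)
    p_impression = posterior_mean(*impression, successes, trials)
    print(f"{trials:>7} observations: "
          f"skeptic = {p_skeptic:.3f}, impression-based = {p_impression:.3f}")

# With 2 observations the two answers differ by a factor of ~250 (the prior
# dominates); with 100,000 observations both sit near 0.5 (the evidence
# dominates, and the starting point washes out).
```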

"Most of the risks arising from long-term trends remain beyond revealing quantification. What is the probability of China’s spectacular economic expansion stalling or even going into reverse? What is the likelihood that Islamic terrorism will develop into a massive, determined quest to destroy the West? Probability estimates of these outcomes based on expert opinion provide at best some constraining guidelines but do not offer any reliable basis for relative comparisons of diverse events or their interrelations. What is the likelihood that a massive wave of global Islamic terrorism will accelerate the Western transition to non–fossil fuel energies? To what extent will the globalization trend be enhanced or impeded by a faster-than-expected sea level rise or by a precipitous demise of the United States? Setting such odds or multipliers is beyond any meaningful quantification."
- Global Catastrophes and Trends, p. 226

"And while computers have been used for many years to assemble other  computers and machines, such deployments do not indicate any imminent self- reproductive capability. All those processes require human actions to initiate them,  raw materials to build the hardware, and above all, energy to run them. I find it hard to visualize how those machines would (particularly in less than a generation) launch, integrate, and sustain an entirely independent exploration, extraction, conversion, and delivery of the requisite energies."
- Global Catastrophes and Trends, p. 26

References:
- Global Catastrophes and Trends: The Next Fifty Years
- The Precipice: Existential Risk and the Future of Humanity
- Making Sense podcast w/ Ord (Clip starts around 40:00)
- Repugnant conclusion
- Arrow's theorem
- Balinski–Young theorem

Support Increments