April 15, 2008

More LHC Philosophy, This Time with Demons

The debate continues on Ethical Perspectives on the News: Toby Ord argues that "These are not the probabilities you are looking for" and Nicholas Shackel offers "Three arguments against turning the Large Hadron Collider on".

Toby's point is simple but important: most risk estimates are based on the assumption that known physics holds, which is after all exactly what we want to test and extend into unknown domains. Hence such estimates are themselves uncertain and not the right kind of risk estimate. On the other hand, it is not clear what kind of estimate would be the right kind.

Nick argues that avoidable risks of destroying all present and future goodness should not be taken, no matter how good their results could be. Unfortunately it is not entirely clear where this axiom comes from and what the limits of its applicability are. Since all actions carry some (microscopic) risk of ending the world, and untested actions or actions affecting the world globally plausibly carry higher risk, it would seem to follow that I am not even allowed to make this post, lest it set off some unlikely chain of events ending in doomsday.

I think Toby's radical uncertainty problem is interesting: how are we supposed to act in such situations? When even the argument that we should take no risks runs into problems (as above) in domains we know something about, it becomes even more problematic in radically uncertain ones. In the end I think the real-world solution is just to muddle on, a bit cautiously (depending on how novel the situation is, how similar it is to past risks and how well current theories cover it), but largely guided by priors based on past experience.

Maybe this can be viewed as a case of "Taleb's demon" (an invention by Peter Taylor, see this ppt). We have the usual urn with 10 black and 10 white balls. If I draw one, you can easily calculate the probability that it is white. But now Taleb's demon appears and does something to the urn. What is the probability now? He could have removed balls or added extra white or black balls. Worse, he could have added red balls. Or frogs. The basis for normal probability theory is that we know the sample space, the space of possibilities. But here we are uncertain even about that.
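A minimal sketch of why this is troubling (Python, with an entirely invented set of demon interventions and weights; the point is precisely that the real list cannot be enumerated like this): with a known urn the probability is a simple ratio, while after the demon's visit any number we compute merely reflects whatever prior we made up over the hypotheses we happened to think of.

```python
from fractions import Fraction

# Known sample space: the classic urn with 10 black and 10 white balls.
urn = {"white": 10, "black": 10}
p_white = Fraction(urn["white"], sum(urn.values()))
print(p_white)  # 1/2 -- a well-defined probability

# After the demon's visit we would need a prior over every possible
# intervention (removed balls, added red balls, frogs, ...). The hypotheses
# and weights below are made up for illustration only.
interventions = [
    ({"white": 10, "black": 10}, 0.25),             # demon did nothing
    ({"white": 5, "black": 10}, 0.25),              # removed some white balls
    ({"white": 10, "black": 10, "red": 10}, 0.25),  # added red balls
    ({"white": 10, "black": 10, "frog": 10}, 0.25), # added frogs
]

p_white_after = sum(
    weight * contents["white"] / sum(contents.values())
    for contents, weight in interventions
)
print(round(p_white_after, 3))  # ~0.375, but only relative to the made-up prior
```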

The interesting thing is that this doesn't deter most people. We seem willing to extend our sample space based on scraps of information (before I mentioned frogs, you were probably still thinking of the urn as holding only balls) and experience: when I draw the concept of Democracy from the urn instead of a material object, you rapidly expand your sample space to cover at least all nouns. We also adjust our priors accordingly, most likely towards the "anything is possible" end of the spectrum as more outré examples come up.

I think we have a similar situation in the case of the LHC and risk. We know Taleb's demon has meddled with the urn, but until someone mentioned black holes that possibility was not on the agenda. The expansion of possibilities also makes us assign more probability and risk concern to events we have no information about. If I start talking about the possibility of risk from self-replicating meson molecules, people will start assigning risk to them automatically. Once a possibility is out of the bottle it cannot be put back in, and it gets assigned a risk level even when there is no information. Add a lot of possibilities, and you add a lot of risk.

The only way of getting the risks under control is to do experiments. Drawing ten red, blue and white balls from the urn is no proof that it does not contain frogs or black holes, but it would calm us somewhat by lowering our posterior probabilities. Similarly, the lack of disasters at the Brookhaven accelerator, the low supernova rate and the timing of Earth's formation relative to other solar systems in the galaxy should also be calming, even if they are not conclusive evidence by any stretch. The Fermi Paradox, on the other hand, might be evidence that there is a problem somewhere.
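A hedged sketch of that calming effect, again with made-up numbers (a 50% prior that the demon added frogs, frogs making up 10% of the urn if he did): ten frog-free draws noticeably lower the posterior, but nowhere near zero.

```python
# Toy Bayesian update: how much do ten frog-free draws reduce our worry?
# Both numbers below are illustrative assumptions, not estimates.
p_frogs = 0.5         # prior probability that the demon added frogs
frog_fraction = 0.10  # fraction of frogs in the urn, if frogs were added

for _ in range(10):   # ten draws, none of which turn out to be a frog
    likelihood_frogs = 1 - frog_fraction  # P(non-frog draw | frogs present)
    likelihood_clean = 1.0                # P(non-frog draw | no frogs)
    evidence = p_frogs * likelihood_frogs + (1 - p_frogs) * likelihood_clean
    p_frogs = p_frogs * likelihood_frogs / evidence

print(round(p_frogs, 3))  # about 0.26: calmer, but frogs are not ruled out
```

The same structure applies to the accelerator record or the supernova rate: each uneventful observation shaves something off the posterior without ever driving it to zero.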

Maybe the real answer to the Fermi paradox is that any sufficiently advanced civilization sees so many possibilities filled with risk that they end up hiding, not doing anything. It would surely fill Taleb's demon with glee.

Posted by Anders3 at April 15, 2008 06:25 PM