As an engineer I'm often surprised at how people assess risk in the world, particularly, of course, the risks associated with technology.
Anyone who regularly listens to the news must think we live in a land of viperous gadgets that attack us and pollute our world. Yet, citizens of the industrialized West live in one of the safest and likely cleanest times in the history of the world. Life expectancy is up, infant mortality down, and diseases that killed millions are now eradicated.
We fear, instead, things like flying in airplanes. Yet to put that in perspective, dying or being injured in a commercial aircraft is about eight times less likely than being injured by some random object falling on us -- something that I'm sure none of us spends a moment worrying about when we start our day. So at first I'm always puzzled by this fear of technology, until I realize we fear because we're human. Our rational side, the side that deals with numbers and scientific theories, is pretty new when compared to the primitive side of our brains, which has been surviving for a long time by making choices based on instinct alone.
For example, if someone asked you to store nuclear waste in your home, your gut reaction would likely be "no!" Yet it turns out that a typical family of four would accumulate about five pounds of nuclear waste in their lifetime. If stored inside a thick metal case, capable of withstanding a house fire or a flood, the waste would form an object about the size of a small orange, which, when placed in a thick-walled cubicle, would ensure safety for you and your family. Would you have this in your home? Likely no. I wouldn't. Yet it is in some sense rational to do so.
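The "small orange" claim can be sanity-checked with a back-of-the-envelope calculation. The density figure below is my own assumption (roughly that of uranium oxide), not a number from the source; actual waste-form densities vary, but the conclusion is not sensitive to the exact value.

```python
import math

# Sanity check: would 5 pounds of nuclear waste really fit in
# an orange-sized sphere? Density is an assumed value (~10 g/cm^3,
# near that of uranium oxide); real waste forms vary.
mass_g = 5 * 453.592            # 5 pounds converted to grams
density_g_per_cm3 = 10.0        # assumed density, g/cm^3
volume_cm3 = mass_g / density_g_per_cm3   # ~227 cm^3

# Diameter of a sphere holding that volume: V = (4/3) * pi * r^3
radius_cm = (3 * volume_cm3 / (4 * math.pi)) ** (1 / 3)
diameter_cm = 2 * radius_cm
print(f"{diameter_cm:.1f} cm")  # roughly 7-8 cm across
```

A sphere seven or eight centimeters across is indeed about the size of a small orange, so the figure in the text is at least physically plausible.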
So, when assessing risk we're not completely rational, and likely never will be. For example, I'm a bit uncomfortable flying yet I'll do a very dangerous thing without even thinking about it: I ride my bike home at night, without a light, in a dark blue business suit.
My irrational behavior highlights what I think really worries people about technology. We worry not as much about the technology as about how humans interact with it.
For example, last year at a German nuclear power plant a monitor detected abnormally high radiation levels in an employee. This led officials to question the worker, who led them to an abandoned French military airfield in southern Germany. There, in a blackberry bush, lay a two-inch-long tube wrapped in a rubber glove. Seeping out of it was a brown solution containing plutonium. The man's motive for taking this plutonium remains unclear. This episode highlights that it's people we fear, not technology.
Douglas Adams, the science fiction writer, captured this fear when he said: "A common mistake that people make when trying to design something completely foolproof is to underestimate the ingenuity of complete fools."
Copyright 2004 William S. Hammack Enterprises