Chronicling the follies of religion and superstition, the virtues of skepticism, and the wonders of the real (natural) universe as revealed by science. Plus other interesting and educational stuff.
"Tell people there’s an invisible man in the sky who created the universe, and the vast majority believe you. Tell them the paint is wet, and they have to touch it to be sure."
“If people are good only because they fear punishment, and hope for reward, then we are a sorry lot indeed”.
“Skeptical scrutiny is the means, in both science and religion, by which deep thoughts can be winnowed from deep nonsense.”
Basically, the Fermi Paradox is the problem of explaining why we can see so few civilizations (zero) in the universe, given its size, its age, and the fact that planets seem to be about as common as stars. One possibility offered up is that intelligent, technologically capable life forms such as ourselves are very likely to accidentally destroy themselves, or at least to suffer catastrophic setbacks.
From the Wikipedia article:
“[Could it be] that technological civilizations may usually or invariably destroy themselves before or shortly after developing [advanced] technology? Possible means of annihilation include nuclear war, biological warfare or accidental contamination, climate change, nanotechnological catastrophe, ill-advised physics experiments, a badly programmed super-intelligence, or a Malthusian catastrophe after the deterioration of a planet’s ecosphere. This general theme is explored both in fiction and in mainstream scientific theorizing. Indeed, there are probabilistic arguments which suggest that human extinction may occur sooner rather than later. In 1966 Sagan and Shklovskii speculated that technological civilizations will either tend to destroy themselves within a century of developing interstellar communicative capability or master their self-destructive tendencies and survive for billion-year timescales.
From a Darwinian perspective, self-destruction would be an ironic outcome of evolutionary success. The evolutionary psychology that developed during the competition for scarce resources over the course of human evolution has left the species subject to aggressive, instinctual drives. These compel humanity to consume resources, extend longevity, and to reproduce—in part, the very motives that led to the development of technological society. It seems likely that intelligent extraterrestrial life would evolve in a similar fashion and thus face the same possibility of self-destruction. And yet, to provide a good answer to Fermi’s Question, self-destruction by technological species (or any sociological explanation) would have to be a near universal occurrence. Otherwise, the few civilizations to which it does not apply would colonize the galaxy.”
There are many possible answers to this riddle, but this particular one carries an important warning: add rampant superstition and tribalism to the mix, and it becomes a possibility that all humans, especially those in power, should be acutely aware of.
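To see why self-destruction would have to be nearly universal to resolve the paradox, it helps to run the numbers in a Drake-equation style. The sketch below is purely illustrative: every parameter value is an assumption chosen for the example, not a measurement, and the star count is only a rough figure for the Milky Way.

```python
# Illustrative Drake-style estimate. All fractions below are assumed
# values for the sake of the example, not established figures.

stars_in_galaxy = 2e11      # rough star count for the Milky Way
frac_with_planets = 0.5     # planets seem roughly as common as stars
frac_habitable = 0.1        # assumed: fraction with a habitable planet
frac_life = 0.1             # assumed: life actually arises there
frac_intelligent = 0.01     # assumed: life becomes technological
frac_surviving = 0.01       # assumed: only 1% avoid self-destruction

civilizations = (stars_in_galaxy * frac_with_planets * frac_habitable
                 * frac_life * frac_intelligent * frac_surviving)

print(f"Expected surviving civilizations: {civilizations:.0f}")
```

Even with the pessimistic assumption that 99% of technological species destroy themselves, these (hypothetical) numbers still leave on the order of a hundred thousand surviving civilizations in one galaxy, which is the quoted article's point: a sociological filter like self-destruction only answers Fermi's question if it applies to essentially everyone.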