
Science Will Destroy Humanity, Says Team Of Scientists


One of the primary goals of science is to advance knowledge and understanding to improve the human condition, but all too often this noble field of study has devolved into a profit-seeking quest for power, at the expense of mankind.

Indeed, technology may be the worst culprit, warns a team of mathematicians, philosophers and scientists at Oxford University's Future of Humanity Institute.

The team, in a forthcoming paper titled "Existential Risk Prevention as Global Priority," says humankind's over-reliance on technology could lead to its demise, and that we are facing a risk to our very existence.

What's more, the team says humankind's demise is not far off; it could come as soon as the next century.

'Threats we have no track record of surviving...'

"There is a great race on between humanity's technological powers and our wisdom to use those powers well," institute director Nick Bostrom told MSN. "I'm worried that the former will pull too far ahead."

For as long as humans have existed on this planet, there have been those who predicted the end of the world as we know it, the latest "fad" in this realm being the hoopla surrounding the now-disproven 2012 Mayan prophecies. Still, folks can't seem to let go of the notion that, at some point in our future, life on Earth will cease to exist.

From Bostrom's paper:

Humanity has survived what we might call natural existential risks for hundreds of thousands of years; thus it is prima facie unlikely that any of them will do us in within the next hundred. This conclusion is buttressed when we analyze specific risks from nature, such as asteroid impacts, supervolcanic eruptions, earthquakes, gamma-ray bursts, and so forth: Empirical impact distributions and scientific models suggest that the likelihood of extinction because of these kinds of risk is extremely small on a time scale of a century or so.

In contrast, our species is introducing entirely new kinds of existential risk - threats we have no track record of surviving. Our longevity as a species therefore offers no strong prior grounds for confident optimism. Consideration of specific existential-risk scenarios bears out the suspicion that the great bulk of existential risk in the foreseeable future consists of anthropogenic existential risks - that is, those arising from human activity.
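The track-record argument in that passage can be made concrete with a rough back-of-the-envelope calculation (our illustration, not taken from Bostrom's paper). By Laplace's rule of succession, if humanity has survived roughly N = 2,000 centuries of natural risks without going extinct (assuming a species age of about 200,000 years, per the "hundreds of thousands of years" above), the estimated chance of extinction from those risks in the next century is

\[
P(\text{extinction within the next century}) \approx \frac{0 + 1}{N + 2} = \frac{1}{2002} \approx 0.05\%,
\]

which squares with the claim that natural risks are "extremely small on a time scale of a century or so." No comparable track record exists for the man-made risks, which is precisely the paper's point.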

Continuing, Bostrom predicts that future technological breakthroughs "may radically expand our ability to manipulate the external world or our own biology."

"As our powers expand, so will the scale of their potential consequences - intended and unintended, positive and negative."

Bostrom goes on to say that well-known threats like an asteroid strike, supervolcanic eruptions and earthquakes are unlikely to threaten humanity in the near future. Even a nuclear war wouldn't completely wipe out humanity; in that event, he says, enough people would survive to rebuild.

Rather, it is the unknowns that pose the real threat to humankind's existence.


[Image: Science has an obligation to serve mankind]

Not all of the news is bad, Bostrom says.

"The Earth will remain habitable for at least another billion years. Civilization began only a few thousand years ago. If we do not destroy mankind, these few thousand years may be only a tiny fraction of the whole of civilized human history," he writes.

Mike Adams, The Health Ranger, notes in an infographic posted at NaturalNews that the onus for protecting humanity falls on those who are creating the technology.

"If an action or policy has a suspected risk of causing harm to the public or to the environment, the burden of proof that it is NOT harmful falls on those taking the action," the graphic says.


SOURCE
