12 Major Things Likely to Destroy The World

A literal reading of the Bible puts the world at about 6,000 years old, while science puts its age at 4.54 billion years, with the human lineage stretching back close to 7 million years. But that argument is for another day.
The aim of this post is to look at a few of the things most likely to destroy the world in the years to come.

1) Catastrophic climate change

According to a 2013 World Bank report, "there is also no certainty that adaptation to a 4°C world is possible." Warming at that level would displace huge numbers of people as sea levels rise and coastal areas become submerged. Agriculture would take a giant hit.
Dennis Pamlin and Stuart Armstrong, authors of the Global Challenges Foundation report that this list draws on, also express concern about geoengineering. In such an extreme warming scenario, interventions like spraying sulfate particles into the stratosphere to cool the Earth may start to look attractive to policymakers or even private individuals. But the risks are unknown, and Pamlin and Armstrong conclude that "the biggest challenge is that geoengineering may backfire and simply make matters worse."

2) Nuclear war

An image of a mushroom cloud from a nuclear test at Bikini Atoll, March 26, 1954. We'd need a lot of these to end the world. (Roger Viollet/Getty Images)
The "good" news here is that nuclear war could only end humanity under very special circumstances. Limited exchanges, like the US's bombings of Hiroshima and Nagasaki in World War II, would be humanitarian catastrophes but couldn't render humans extinct.
Even significantly larger exchanges fall short of the level of impact Pamlin and Armstrong require. "Even if the entire populations of Europe, Russia and the USA were directly wiped out in a nuclear war — an outcome that some studies have shown to be physically impossible, given population dispersal and the number of missiles in existence — that would not raise the war to the first level of impact, which requires > 2 billion affected," Pamlin and Armstrong write.
So why does nuclear war make the list? Because of the possibility of nuclear winter. That is, if enough nukes are detonated, world temperatures would fall dramatically and quickly, disrupting food production and possibly rendering human life impossible. It's unclear if that's even possible, or how big a war you'd need to trigger it, but if it is a possibility, that means a massive nuclear exchange is a possible cause of human extinction.

3) Global pandemic

This kid can see the horrors to come. (Mario Villafuerte / Getty Images)
As with nuclear war, not just any pandemic qualifies. Past pandemics — like the Black Death or the Spanish flu of 1918 — have killed tens of millions of people, but failed to halt civilization. The authors are interested in an even more catastrophic scenario.
Is that plausible? Medicine has improved dramatically since the Spanish flu. But on the flip side, transportation across great distances has increased, and more people are living in dense urban areas. That makes worldwide transmission much more of a possibility.
Even a pandemic that killed off most of humanity would surely leave a few survivors who have immunity to the disease. The risk isn't that a single contagion kills everyone; it's that a pandemic kills enough people that the rudiments of civilization — agriculture, principally — can't be maintained and the survivors die off.

4) Ecological catastrophe

"Ecological collapse refers to a situation where an ecosystem suffers a drastic, possibly permanent, reduction in carrying capacity for all organisms, often resulting in mass extinction," the report explains.
Mass extinctions can happen for a number of reasons, many of which have their own categories on this list: global warming, an asteroid impact, etc. The journalist Elizabeth Kolbert has argued that humans may be in the process of causing a mass extinction event, not least due to carbon emissions. Given that humans are heavily dependent on ecosystems, both natural and artificial, for food and other resources, mass extinctions that disrupt those ecosystems threaten us as well.

5) Global system collapse

Weimar Germany amid hyperinflation, in 1923. We'd need something even worse if humanity as a whole is going to destroy itself. (Albert Harlingue/Roger Viollet/Getty Images)
This is a vague one, but it basically means the world's economic and political systems collapse, by way of something like "a severe, prolonged depression with high bankruptcy rates and high unemployment, a breakdown in normal commerce caused by hyperinflation, or even an economically-caused sharp increase in the death rate and perhaps even a decline in population."
The paper also mentions other possibilities, like a coronal mass ejection from the Sun that disrupts electrical systems on Earth.
That said, it's unclear whether these things would pose an existential threat. Humanity has survived past economic downturns — even massive ones like the Great Depression. An economic collapse would have to be considerably more massive than that to risk human extinction or to kill enough people that the survivors couldn't recover.

6) Major asteroid impact

A simulation of a multi-kilometer asteroid impact. (Fredrick/NASA/Wikimedia commons)
Major asteroid impacts have caused large-scale extinction on Earth in the past. Most famously, the Chicxulub impact 66 million years ago is widely believed to have caused the mass extinction that wiped out the dinosaurs (an alternative theory blames volcanic eruptions, about which more in a second). Theoretically, a future impact could have a similar effect.
The good news is that NASA is fairly confident in its ability to track asteroids large enough to seriously disrupt human life upon impact, and detection efforts are improving. Scientists are also working on ways to deflect truly devastating asteroids, such as by crashing spacecraft into them with enough force to change their path so they miss Earth.

7) Supervolcano

An example of the possible distribution of ash from a month-long Yellowstone supereruption. Keep in mind that such an eruption is extremely unlikely. (USGS)
As with asteroids, there's historical precedent for volcanic eruptions causing mass extinction. The Permian–Triassic extinction event, which rendered something like 90 percent of the Earth's species extinct, is believed to have been caused by an eruption.
Eruptions can cause significant global cooling and can disrupt agricultural production. They're also basically impossible to prevent, at least today, though they're also extremely rare. The authors conclude another Permian–Triassic level eruption is "extremely unlikely on human timescales, but the damage from even a smaller eruption could affect the climate, damage the biosphere, affect food supplies, and create political instability."
As with pandemics, the risk isn't so much that the event itself would kill everyone as that it would make continued survival untenable for those who lived through it.

8) Synthetic biology

What if we tweaked this so it killed everybody? (UIG via Getty Images)
This isn't a risk today, but it could be in the future. Synthetic biology is an emerging scientific field that focuses on the creation of biological systems, including artificial life.
The hypothetical danger is that the tools of synthetic biology could be used to engineer a supervirus or superbacteria more infectious and destructive than anything that evolved naturally. Most likely, such an organism would be created as a biological weapon, whether by a military or by a non-state actor.
The risk is that such a weapon would either be used in warfare or a terrorist attack, or else leak from a lab accidentally. Either scenario could wind up threatening humanity as a whole if the bioweapon spreads beyond the initial target and becomes a global problem. As with regular pandemics, actual extinction would only happen if survivors were unable to adapt to a giant population decline.

9) Nanotechnology

John Winskas, a student in the nanotechnology research and education center at the University of South Florida, looks through a microscope at the tiny molecules that will seal our fates. (Joe Raedle/Getty Images)
This is another potential risk in the future. The concern here is that nanotech democratizes industrial production, thus giving many more actors the ability to develop highly destructive weapons. "Of particular relevance is whether nanotechnology allows rapid uranium extraction and isotope separation and the construction of nuclear bombs, which would increase the severity of the consequent conflicts," Pamlin and Armstrong write. Traditional balance-of-power dynamics wouldn't apply if individuals and small groups were capable of amassing large, powerful arsenals.
There's also a concern that self-replicating nanotech would create a "gray goo" scenario, in which it grows out of control and encroaches upon resources humans depend on, causing mass disruption and potentially civilizational collapse.

10) Artificial Intelligence

Watson — champion Jeopardy! player, potential slayer of mankind. (Universal History Archive/Getty Images)
The report is also concerned with the possibility of exponential advances in artificial intelligence. Once computer programs grow advanced enough to teach themselves computer science, they could use that knowledge to improve themselves, causing a spiral of ever-increasing superintelligence.
If AI remains friendly to humans, this would be a very good thing indeed, with the potential to speed up research in a variety of domains. The risk is that a superintelligent AI would have little use for humans and, whether out of malevolence or perceived necessity, would destroy us all.

11) Future bad governance

President Obama speaks about climate change at the UN, a perfect case study in the inability of global institutions to save humans from themselves. (Andrew Burton/Getty Images)
This is perhaps the vaguest item on the list — a kind of meta-risk. Most of the problems enumerated above would require some kind of global coordinated action to address. Climate change is the most prominent example, but in the future things like nanotech and AI regulation would need to be coordinated internationally.
The danger is that governance structures often fail and sometimes wind up exacerbating the problems they were trying to fix. A policy failure in dealing with a threat that could cause human extinction would thus have hugely negative consequences.

12) Unknown unknowns

Behold the face of Death, Destroyer of Worlds. (Shutterstock)
The first 11 items on the list are risks we can identify as potential threats worth tackling. There are almost certainly other dangers out there with grave potential impacts that we can't predict. It's hard to even think about how to tackle this problem, but more research into global catastrophic risks could be helpful.
