The 9/11 Commission
an independent, bipartisan commission created by congressional legislation and the signature of President George W. Bush in late 2002. It is chartered to prepare a full and complete account of the circumstances surrounding the September 11, 2001 terrorist attacks, including preparedness for and the immediate response to the attacks.
“The greatest danger of another catastrophic attack in the United States will materialize if the world’s most dangerous terrorists acquire the world’s most dangerous weapons.”
Frank Abagnale
was the master criminal whose autobiography Catch Me If You Can was turned into a film by Steven Spielberg starring Leonardo DiCaprio and Tom Hanks.
“It is important to remember that technology breeds crime, it always has … it always will.”
Jamal Ahmidan
was a principal behind the 3/11 attacks in Spain that left about 2,000 dead or injured and gave Spain a new government that withdrew its troops from Iraq.
“We change states, we destroy others with Allah’s help and even decide the future of the world’s economy. We won’t accept being mere passive agents in this world.”
writes for the Samizdata blog.
“We are in the middle, not merely of a war in Iraq, but of a global war on whose outcome our very lives may depend. I am too close to technology not to realize how much evil can be done by a small number of dedicated followers of the dark side.”
Michael Anissimov
was recently advocacy director for the Singularity Institute for Artificial Intelligence. He is a member of our Advisory Board.
“The arrival of nanotechnology will herald a mess of totally unmanageable difficulties. Human intelligence and ethics are not enough to handle these challenges. Without smarter-than-human, kinder-than-human forms of intelligence to assist us in confronting these grave difficulties, our continued survival cannot be ensured.”
“If my million dollars can avert the chance of existential disaster by, say, 0.0001%, then the expected utility of this action relative to the expected utility of life extension advocacy is shocking. That’s 0.0001% of the utility of quadrillions or more humans, transhumans, and posthumans leading fulfilling lives. I’ll spare the reader from working out the math and utility curves — I’m sure you can imagine them. So, why is it that people tend to devote more resources to life extension than risk prevention? [My guesses are]:
- They estimate the probability of any risk occurring to be extremely low.
- They estimate their potential influence over the likelihood of risk to be extremely low.
- They feel that positive PR towards any futurist goals will eventually result in higher awareness of risk.
- They fear social ostracization if they focus on ‘Doomsday scenarios’ rather than traditional extension.”
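The expected-value comparison the quote leaves to the reader takes one line. A minimal sketch using only the quote’s own hypothetical figures (a 0.0001% reduction in extinction risk and on the order of a quadrillion future lives; neither number is an independent estimate):

\[
% Figures from the quote: \Delta p = 0.0001\% = 10^{-6}; N \approx 10^{15} future lives.
\mathbb{E}[\text{lives saved}] = \Delta p \cdot N \approx 10^{-6} \times 10^{15} = 10^{9}
\]

On those assumptions, one million dollars buys on the order of a billion expected lives, which is the lopsided ratio the quote gestures at.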
“If an existential disaster occurs, not only will the possibilities of extreme life extension, sophisticated nanotechnology, intelligence enhancement, and space expansion never bear fruit, but everyone will be dead, never to come back. Because we have so much to lose, existential risk is worth worrying about even if our estimated probability of occurrence is extremely low.
It is not the funding of life extension research projects that immortalists should be focusing on. It should be projects that decrease existential risk. By default, once the probability of existential risk is minimized, life extension technologies will be developed and applied. There are powerful economic and social imperatives in that direction, but few towards risk management. Existential risk creates a ‘loafer problem’ — we always expect someone else to do it. I assert that this is a dangerous strategy and should be discarded in favor of making prevention of such risks a central focus.”
Abdullah Ahmad Badawi
Prime Minister of Malaysia and Chairman of the 57-nation Organization of the Islamic Conference.
“The whole world is getting very disturbed. The frequency (of terrorist attacks) seems to be mounting.”
Jeff Bezos
founder, president, chief executive officer (CEO), and chairman of the board of Amazon.com.
“For better or worse, it is really not a part of our culture to look at things defensively. We rarely say, ‘Oh my God, we’ve got to do something about that existential threat.’ Maybe one day we’ll become extinct because of that deficiency in our nature.”
Scott Borg
Director and Chief Economist of the U.S. Cyber Consequences Unit, a Department of Homeland Security advisory group. He is also a member of our Scientific Advisory Board.
“My biggest obstacle is people’s unrealistic belief that if a given disaster hasn’t happened yet, it won’t ever happen.”
Nick Bostrom
winner of a Templeton Foundation grant, cofounder of the World Transhumanist Association, and director of the Future of Humanity Institute at the University of Oxford.
“At the present rate of scientific and technological progress, there is a real chance that we will have molecular manufacturing or superhuman artificial intelligence well within the first half of this century. Now, this creates some considerable promises and dangers. In a worst-case scenario, intelligent life could go extinct.”
“For example, if someone thought that a century-long ban on new technology were the only way to avoid a nanotechnological doomsday, she could still classify as a transhumanist, provided her opinion did not stem from a general technophobia … but was the result of a rational deliberation of the likely consequences of the possible policies.”
“The technology to produce a destructive nanobot seems considerably easier to develop than the technology to create an effective defense against such an attack (a global nanotech immune system, an ‘active shield’).”
“Our approach to existential risks cannot be one of trial-and-error. There is no opportunity to learn from errors. The reactive approach — see what happens, limit damages, and learn from experience — is unworkable. Rather, we must take a proactive approach. This requires foresight to anticipate new types of threats and a willingness to take decisive preventive action and to bear the costs (moral and economic) of such actions.”
“The Fermi Paradox refers to the question mark that hovers over the data point that we have seen no signs of extraterrestrial life. This tells us that it is not the case that life evolves on a significant fraction of Earth-like planets and proceeds to develop advanced technology, using it to colonize the universe in ways that would have been detected with our current instrumentation. There must be (at least) one Great Filter — an evolutionary step that is extremely improbable — somewhere on the line between Earth-like planet and colonizing-in-detectable-ways civilization. If the Great Filter isn’t in our past, we must fear it in our (near) future. Maybe nearly every civilization that develops a certain level of technology causes its own extinction.”
“It is clear from this section that we are rapidly inventing new ways of destroying ourselves, and that the risk to mankind is increasing exponentially.” As early as 2005, there will be a “deliberate biotech self-destruct by a malicious biotech researcher” and “terrorism will rise beyond the capability of government systems.”
Joe Buff
bestselling author of Straits Of Power, Tidal Rip, Crush Depth, Thunder in the Deep, and Deep Sound Channel. He is a regular columnist for military.com and the winner of the 1999 and 2000 Literary Awards from the Naval Submarine League.
“Some nut-case getting hold of reprogrammable self-replicating nano-bots, originally developed by mainstream experts for benevolent uses, is indeed a very scary thought… As I’m sure you know, sad experience has shown that often people who commit suicide, or arson for revenge, or similar violent acts, have no notion of the wider and lasting implications of their impulsive behavior. So yes, I certainly agree, some loony with the access to the tech, and the fleeting moment of irrational motive, could indeed precipitate a gigantic state-change of the sort considered in the math of Catastrophe Theory, far beyond the bad actor’s expectation or comprehension. Yikes!”
Warren Buffett
our 2002 Guardian Award winner, is the world’s second wealthiest man, who is known as the ‘Oracle of Omaha’ for his astute investments.
“Predicting rain doesn’t count, building arks does.” “Fear may recede with time, but the danger won’t — the war against terrorism can never be won.”
“We’re going to have something in the way of a major nuclear event in this country. It will happen. Whether it will happen in 10 years or 10 minutes, or 50 years… it’s virtually a certainty.”
“We would regard ourselves as vulnerable to extinction as a company if we did not have nuclear, biological and chemical risks excluded from our policies.”
cofounder of the Alliance to Rescue Civilization (ARC) and a member of our Scientific Advisory Board.
“The question to ask is whether the risk of traveling to space is worth the benefit. The answer is an unequivocal yes, but not only for the reasons that are usually touted by the space community: the need to explore, the scientific return, and the possibility of commercial profit. The most compelling reason, a very long-term one, is the necessity of using space to protect Earth and guarantee the survival of humanity.”
George W. Bush
U.S. President.
“Our generation faces new and grave threats to liberty, to the safety of our people and to civilization itself. We face an aggressive force that glorifies death, that targets the innocent and seeks the means to murder on a massive scale.” “Wishful thinking might bring comfort, but not security.”
“The gravest danger… lies at the perilous crossroads of radicalism and technology.”
“I don’t think you can win [the war on terror].”
involved in launching rockets for the past 25 years, is cofounder of Celestis, Inc., and is a member of our Scientific Advisory Board.
“If we get a second toehold in the solar system in the next 100 years, we will have gone a long ways toward ensuring the long-term viability of the human species.”
Arthur C. Clarke
prophetic SF author who in 1945 predicted a world linked by geostationary satellites.
“This terrorism is a frightful danger and it is hard to see how we can get complete protection from it.”
Michael Crichton
was author of The Andromeda Strain, Jurassic Park, and Prey. He was also the creator of the television series ER.
“Sometime in the twenty-first century, our self-deluded recklessness will collide with our growing technological power. One area where this will occur is in the meeting point of nanotechnology, biotechnology, and computer technology. What all three have in common is the ability to release self-replicating entities into the environment.” “Nobody does anything until it’s too late. We put the stoplight at the intersection after the kid is killed.” “‘They didn’t understand what they were doing.’ I’m afraid that will be on the tombstone of the human race.”
War Diary is included in the U.S. Library of Congress historic Internet collection on the 2003 Iraq War. This online news source contains in-depth coverage of terrorism, security, political analysis, and espionage and is available in English and Hebrew.
“While the Americans focus on their war against insurgents in Iraq and the Israelis are caught up in fighting Palestinian terrorists, Al Qaeda is drawing a ring of fire around both.”
Retired Army General Wayne A. Downing
was U.S. President George W. Bush’s deputy national security adviser for counterterrorism until July 8, 2002.
“The United States may have to declare martial law someday in the case of a devastating attack with weapons of mass destruction causing tens of thousands of casualties.” “Most sobering to me was [the terrorists’] research on chemical weapons, radiological dispersion devices, and their fascination with nuclear weapons. They are obsessed with them.”
Eric Drexler
founder of the Foresight Institute and of the nanotechnology movement.
“Foresight’s concern for the long-term potential abuse of nanotechnology has been confirmed and strengthened. Those who abuse technology — from airliners to anthrax — for destructive ends do exist and are unlikely to stop before full nanotech arrives, with all its power for both good and ill.” “It would be easy to say, ‘let government or industry figure out how to prevent nanotech misuse,’ but the events of Sept. 11 and afterwards show this to be naive. (The current attempt to make airliners safer by keeping all sharp objects off the plane is laughable — a pair of glass eyeglasses is easily broken and used instead. The authorities dealing with the anthrax attacks expressed surprise that anthrax could leak from ‘sealed’ envelopes — when anyone who’s ever licked one can see that the adhesive doesn’t extend to the flap’s edges.) Outside perhaps the military, government doesn’t do too well at anticipating emergencies and planning policies for them — their incentives are too political, and their time horizons are too short…”
“If extraterrestrial civilizations exist, and if even a small fraction were to behave as all life on Earth does, then they should by now have spread across space.” “By now, after hundreds of millions of years, even widely scattered civilizations would have spread far enough to meet each other, dividing all of space among them.” “An advanced civilization pushing its ecological limits would, almost by definition, not waste both matter and energy. Yet we see such waste in all directions, as far as we can see spiral galaxies: their spiral arms hold dust clouds made of wasted matter, backlit by wasted starlight… The idea that humanity is alone in the visible universe is consistent with what we see in the sky… Thus for now, and perhaps forever, we can make plans for our future without concern for limits imposed by other civilizations.”
Robert A. Freitas Jr.
was research scientist at Zyvex LLC, the Earth’s first molecular nanotechnology company, and is the author of Nanomedicine, the first book-length technical discussion of the medical applications of nanotechnology and medical nanorobotics. He is a 2006 Lifeboat Foundation Guardian Award winner and a member of our Scientific Advisory Board.
“Specific public policy recommendations suggested by the results of the present analysis include: An immediate international moratorium on all artificial life experiments implemented as nonbiological hardware. In this context, ‘artificial life’ is defined as autonomous foraging replicators, excluding purely biological implementations (already covered by NIH guidelines tacitly accepted worldwide) and also excluding software simulations which are essential preparatory work and should continue. Alternative ‘inherently safe’ replication strategies such as the broadcast architecture are already well-known.”
Bill Frist
U.S. Senate Majority Leader.
“Like everyone else, politicians tend to look away from danger, to hope for the best, and pray that disaster will not arrive on their watch even as they sleep through it. This is so much a part of human nature that it often goes unchallenged. But we will not be able to sleep through what is likely coming soon — a front of unchecked and virulent epidemics, the potential of which should rise above your every other concern. For what the world now faces, it has not seen even in the most harrowing episodes of the Middle Ages or the great wars of the last century… No intelligence agency, no matter how astute, and no military, no matter how powerful and dedicated, can assure that a few technicians of middling skill using a few thousand dollars worth of readily available equipment in a small and apparently innocuous setting cannot mount a first-order biological attack. It’s possible today to synthesize virulent pathogens from scratch, or to engineer and manufacture prions that, introduced undetectably over time into a nation’s food supply, would after a long delay afflict millions with a terrible and often fatal disease. It’s a new world… So what must we do? I propose an unprecedented effort — a ‘Manhattan Project for the 21st Century’ — not with the goal of creating a destructive new weapon, but to defend against destruction wreaked by infectious disease and biological weapons… This is a bold vision. But it is the kind of thing that, once accomplished, is done. And it is the kind of thing that calls out to be done — and that, if not done, will indict us forever in the eyes of history. In diverting a portion of our vast resources to protect nothing less than our lives, the lives of our children, and the life of our civilization, many benefits other than survival would follow in train — not least the satisfaction of having done right.”
Bill Gates
cofounder of Microsoft, which became the world’s largest PC software company. He is also the 2015 Lifeboat Foundation Guardian Award winner.
“This is like earthquakes, you should think in order of magnitudes. If you can kill 10 people that’s a 1, 100 people that’s a two… Bioterrorism is the thing that can give you not just sixes, but sevens, eights, and nines. With nuclear war, once you have got a six, or a seven, or eight, you’d think it would probably stop. With bioterrorism it’s just unbounded if you are not there to stop the spread of it.”
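Read literally, the scale being sketched here is the base-10 logarithm of the death toll, the same construction as earthquake magnitude. The formula below is a paraphrase of the analogy, not a formula Gates states:

\[
% Magnitude of an attack, by analogy with the Richter scale:
M = \log_{10}(\text{deaths}), \qquad M = 1 \Rightarrow 10, \quad M = 2 \Rightarrow 100, \quad M = 9 \Rightarrow 10^{9}
\]

His point is that nuclear war plausibly saturates somewhere around M = 6 to 8, while an unchecked engineered pandemic has no comparable ceiling.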
“I am in the camp that is concerned about super intelligence. First the machines will do a lot of jobs for us and not be super intelligent. That should be positive if we manage it well. A few decades after that though the intelligence is strong enough to be a concern. I agree with Elon Musk and some others on this and don’t understand why some people are not concerned.”
Rudolph Giuliani
mayor of New York City when it was attacked on 9/11.
“The most dangerous situation is where you’re facing peril but you’re not aware of it.”
Professor of Biomaterials, Fierer Chair of Molecular Cell Biology, and Biomedical Materials Engineering and Science Program Chair at Alfred University and is a member of our Scientific Advisory Board.
“…because of nanobiotechnology, we have never been closer to a Grey Goo scenario.”
NASA Administrator Michael D. Griffin
“But the goal isn’t just scientific exploration… it’s also about extending the range of human habitat out from Earth into the solar system as we go forward in time… In the long run a single-planet species will not survive. We have ample evidence of that.”
Julian Haight
president of SpamCop.net, the premier spam reporting service.
“I’m getting bombed off the face of the Earth and no one cares.”
Stephen Hawking
was a famous cosmologist who discovered that black holes are not completely black, but emit radiation and eventually evaporate and disappear.
“It is important for the human race to spread out into space for the survival of the species. Life on Earth is at the ever-increasing risk of being wiped out by a disaster, such as sudden global warming, nuclear war, a genetically engineered virus or other dangers we have not yet thought of.”
“In the long term, I am more worried about biology. Nuclear weapons need large facilities, but genetic engineering can be done in a small lab. You can’t regulate every lab in the world. The danger is that either by accident or design, we create a virus that destroys us.” “I don’t think the human race will survive the next thousand years, unless we spread into space. There are too many accidents that can befall life on a single planet.”
Robert A. Heinlein
was an influential and controversial science fiction author. The English language absorbed several words from his fiction, including “grok”, meaning “to understand so thoroughly that the observer becomes part of the observed.”
“The Earth is just too small and fragile a basket for the human race to keep all its eggs in.”
Barbara Marx Hubbard
author, public speaker, social innovator, and President of the Foundation for Conscious Evolution. She is also a member of our Advisory Board.
“If Earth is considered a closed system, there will be less for all forever. The frontier is closed, the wilderness is gone, nature is being destroyed by human consumers, while billions are starving. The future indeed looks grim, and there are, ultimately, no really long-range, positive solutions, nor motivation for making the sacrifices and doing the hard work needed now, unless we understand that we are evolving from an Earth-only toward an Earth-space or universal species.”
Admiral David E. Jeremiah
US Navy (Ret.), former Vice Chairman of the Joint Chiefs of Staff.
“Somewhere in the back of my mind I still have this picture of five smart guys from Somalia or some other nondeveloped nation who see the opportunity to change the world. To turn the world upside down. Military applications of molecular manufacturing have even greater potential than nuclear weapons to radically change the balance of power.”
Bill Joy
“Edison of the Internet” is inventor of the Unix text editor vi, cofounder of Sun Microsystems, and a 2006 Lifeboat Foundation Guardian Award winner.
“Hope is a lousy defense.”
“We are being propelled into this new century with no plan, no control, no brakes.” “But many other people who know about the dangers still seem strangely silent. When pressed, they trot out the ‘this is nothing new’ riposte — as if awareness of what could happen is response enough.” “I think it is no exaggeration to say we are on the cusp of the further perfection of extreme evil, an evil whose possibility spreads well beyond that which weapons of mass destruction bequeathed to the nation-states, on to a surprising and terrible empowerment of extreme individuals.” “An immediate consequence of the Faustian bargain in obtaining the great power of nanotechnology is that we run a grave risk — the risk that we might destroy the biosphere on which all life depends.” “…if our own extinction is a likely, or even possible, outcome of our technological development, shouldn’t we proceed with great caution?”
Michio Kaku
co-creator of string field theory.
“Of all the generations of humans that have walked the surface of the Earth — for 100,000 years, going back when we first left Africa — the generation now alive is the most important. The generation now alive, the generation that you see, looking around you, for the first time in history, is the generation that controls the destiny of the planet itself.”
Garry Kasparov
chairman of the United Civil Front, a democratic activist group based in Russia. He was the world chess champion from 1985 to 2000 and the world’s top-rated player for some two decades.
“My matches against generations of chess computers made it painfully clear to me that the march of technology cannot be stopped. The lucky moment we have inhabited, in which weapons of mass destruction (WMD) are prohibitively expensive and difficult to manufacture, is rapidly coming to an end.”
Mickey Kaus
author of the blog Kausfiles, published in Microsoft’s Slate magazine, and of the book The End of Equality.
“Mass-destructive terrorism in the near future won’t even come from Al Qaeda, much less from nation states — but rather from small groups of highly motivated causists or even from loners.”
“I’m especially not persuaded, for example, that when technology puts greater and greater destructive power into the hands of smaller and smaller numbers of individuals it won’t ultimately lead to some sort of doom. Imagine a rowboat with ten people, of varying religious beliefs, all of whom have their fingers on the trigger of a personal nuclear device. They try to get along and run a little society. How many times will this scenario result in a big explosion? More often than not, I suspect.”
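Kaus’s rowboat intuition is easy to make precise. If each of the ten passengers independently has probability p of pulling the trigger over the voyage (p is purely illustrative; the quote names no number), then:

\[
% Chance that at least one of ten independent actors detonates:
P(\text{explosion}) = 1 - (1 - p)^{10}, \qquad p = 0.1 \;\Rightarrow\; P \approx 1 - 0.9^{10} \approx 0.65
\]

Even a one-in-ten chance per person makes “more often than not” the expected outcome, which is the force of the thought experiment.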
Ed Koch
former Mayor of New York City.
“I believe that the U.S. is faltering in the current war against international terrorism, and we are losing our will to prevail.”
Charles Krauthammer
syndicated columnist who appears in the Washington Post and other publications and commentator on various TV programs. He earned his M.D. from Harvard University’s medical school in 1975 and won the Pulitzer Prize in 1987.
“Resurrection of the [1918 Flu] virus and publication of its structure open the gates of hell. Anybody, bad guys included, can now create it. Biological knowledge is far easier to acquire for Osama bin Laden and friends than nuclear knowledge. And if you can’t make this stuff yourself, you can simply order up DNA sequences from commercial laboratories around the world that will make it and ship it to you on demand… And if the bad guys can’t make the flu themselves, they could try to steal it. That’s not easy. But the incentive to do so from a secure facility could not be greater. Nature, which published the full genome sequence, cites Rutgers bacteriologist Richard Ebright as warning that there is a significant risk ‘verging on inevitability’ of accidental release into the human population or of theft by a ‘disgruntled, disturbed or extremist laboratory employee.’ Why try to steal loose nukes in Russia? A nuke can only destroy a city. The flu virus, properly evolved, is potentially a destroyer of civilizations. We might have just given it to our enemies. Have a nice day.”
Ray Kurzweil
prophetic author of the 1990 book The Age of Intelligent Machines, in which he correctly predicted advancements in AI. He was also the principal developer of the first omni-font optical character recognition, the first print-to-speech reading machine for the blind, the first CCD flat-bed scanner, and the first commercially marketed large-vocabulary speech recognition. He is a member of the U.S. Army Science Advisory Group, our 2005 Guardian Award winner, and a member of our Scientific Advisory Board.
“…the means and knowledge will soon exist in a routine college bioengineering lab (and already exists in more sophisticated labs) to create unfriendly pathogens more dangerous than nuclear weapons.”
“I advocate a one hundred billion dollar program to accelerate the development of technologies to combat biological viruses.”
“We have an existential threat now in the form of the possibility of a bioengineered malevolent biological virus. With all the talk of bioterrorism, the possibility of a bioengineered bioterrorism agent gets little and inadequate attention. The tools and knowledge to create a bioengineered pathogen are more widespread than the tools and knowledge to create an atomic weapon, yet it could be far more destructive. I’m on the Army Science Advisory Group (a board of five people who advise the Army on science and technology), and the Army is the institution responsible for the nation’s bioterrorism protection. Without revealing anything confidential, I can say that there is acute awareness of these dangers, but there is neither the funding nor national priority to address them in an adequate way.”
“The decision by the U.S. Department of Health & Human Services to publish the full genome of the 1918 influenza virus on the Internet in the GenBank database is extremely dangerous and immediate steps should be taken to remove this data.”
“Grey goo certainly represents power — destructive power — and if such an existential threat were to prevail, it would represent a catastrophic loss… Although the existential nanotechnology danger is not yet at hand, denial is not the appropriate strategy.”
“A self-replicating pathogen, whether biological or nanotechnology based, could destroy our civilization in a matter of days or weeks.”
“We can envision a more insidious possibility. In a two-phased attack, the nanobots take several weeks to spread throughout the biomass but use up an insignificant portion of the carbon atoms, say one out of every thousand trillion (10¹⁵). At this extremely low level of concentration, the nanobots would be as stealthy as possible. Then, at an ‘optimal’ point, the second phase would begin with the seed nanobots expanding rapidly in place to destroy the biomass. For each seed nanobot to multiply itself a thousand trillionfold would require only about 50 binary replications, or about 90 minutes.”
“Recall that biological evolution is measured in millions and billions of years. So if there are other civilizations out there, they would be spread out in terms of development by huge spans of time. The SETI assumption implies that there should be billions of ETIs (among all the galaxies), so there should be billions that lie far ahead of us in their technological progress. Yet it takes only a few centuries at most from the advent of computation for such civilizations to expand outward at at least light speed. Given this, how can it be that we have not noticed them? The conclusion I reach is that it is likely (although not certain) that there are no such other civilizations.”
“To this day, I remain convinced of this basic philosophy: no matter what quandaries we face — business problems, health issues, relationship difficulties, as well as the great scientific, social, and cultural challenges of our time — there is an idea that can enable us to prevail. Furthermore, we can find that idea. And when we find it, we need to implement it. My life has been shaped by this imperative. The power of an idea — this is itself an idea.”
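The arithmetic in the two-phased scenario is internally consistent: a thousand trillion is 10¹⁵, and fifty doublings more than cover it, with the 90-minute figure implying a replication cycle just under two minutes (the cycle time is implied by the quote’s own numbers, not an independent engineering estimate):

\[
% Fifty binary replications versus a thousand trillion (10^{15}):
2^{50} \approx 1.13 \times 10^{15} \geq 10^{15}, \qquad \frac{90~\text{min}}{50~\text{doublings}} = 1.8~\text{min per doubling}
\]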
John Leslie
author of The End of the World: The Science and Ethics of Human Extinction and a member of our Scientific Advisory Board.
“Our failure to detect intelligent extraterrestrials may indicate not so much how rarely these have evolved, but rather how rapidly they have destroyed themselves after developing technological civilizations.” “What is surprising is that so little has been done to develop Earth-based artificial biospheres… If one-hundredth as much had been spent on developing artificial biospheres as on making nuclear weapons, a lengthy future for humankind might by now be virtually assured.”
Ken Livingstone
mayor of London, said the following after Al Qaeda attacked Spain (and before the July 2005 bombings, the worst attack on London since World War II).
“It would be miraculous if, with all the terrorist resources arranged against us, terrorists did not get through, and given that some are prepared to give their own lives, it would be inconceivable that someone does not get through to London.”
András Lőrincz
Head Senior Researcher, Neural Information Processing Group, Eötvös Loránd University, Budapest, Hungary, is a member of our Scientific Advisory Board.
“I subscribe to the following opinion: The future and well-being of the Nation depend on the effective integration of Information Technologies into its various enterprises and social fabric. Information Technologies are designed, used, and have consequences in a number of social, economic, legal, ethical, and cultural contexts. With the rise of unprecedented new technologies … and their increasing ubiquity in our social and economic lives, large-scale social, economic, and scientific transformations are predicted. While these transformations are expected to be positive … there is general agreement among leading researchers that we have insufficient scientific understanding of the actual scope and trajectory of these socio-technical transformations.”
Richard G. Lugar
United States Senator for the state of Indiana. He is also the U.S. Senate Foreign Relations Committee Chairman.
“Even if we succeed spectacularly at building democracy around the world, bringing stability to failed states and spreading economic opportunity broadly, we will not be secure from the actions of small, disaffected groups that acquire weapons of mass destruction.”
Kelvin Lynn
Director of the Center for Materials Research at Washington State University. Dr. Lynn has developed an “antimatter trap” that the U.S. Air Force is considering as the basis of an antimatter bomb which would be over 1,000 times as powerful as an H-bomb.
“I think we need to get off this planet, because I’m afraid we’re going to destroy it.”
author of Nanosecurity and the Future (if Any), and a member of our Scientific Advisory Board.
“It would seem wise to locate the initial nanolabs in remote locations, and to equip each with a sizable and immovable fusion warhead designed to detonate upon notification of a nanoevent. To prevent the warhead itself from being disassembled before notification can be sent or received, redundant backup detonation procedures are called for. The weapon could, for example, be placed in a vacuum which, if broken, initiates detonation. Alternatively, the weapon could be suspended in a fluid whose volume must remain constant, under pressure which must remain unaltered, within an electromagnetic field which must be maintained, etc. A combination of such measures — the violation of any one of which alone will trigger detonation — would perhaps be wisest. Manual detonation might also be permitted.”
MIT Technology Review
“There is growing scientific consensus that biotechnology — especially, the technology to synthesize ever larger DNA sequences — has advanced to the point that terrorists and rogue states could engineer dangerous novel pathogens.”
Elon Musk
often likened to a real-life Tony Stark from Marvel’s Iron Man comics for his role in cutting-edge companies including SpaceX, a private space exploration company that holds the first private contracts from NASA for resupply of the International Space Station, and the electric car company Tesla Motors. Watch Elon in Iron Man 2! He is winner of the 2014 Guardian Award.
“I think we should be very careful about artificial intelligence. If I had to guess at what our biggest existential threat is, it’s probably that. So we need to be very careful.”
“If you look at our current technology level, something strange has to happen to civilizations, and I mean strange in a bad way. And it could be that there are a whole lot of dead, one-planet civilizations.”
“Sooner or later, we must expand life beyond this green and blue ball — or go extinct.”
Jonathan Nolan
cowrote The Dark Knight, The Dark Knight Rises, and Interstellar. He is co-creator of Westworld and creator of Person of Interest.
“You have these companies — Facebook, Google — barreling toward AI with zero accountability, because it services their corporate mandate… if they’re taking the same approach toward AI that they’re taking toward their responsibilities in social media, we’re fucked.”
Peggy Noonan
contributing editor of The Wall Street Journal and author of A Heart, a Cross, and a Flag.
“Man has never devised a weapon he hasn’t eventually used.”
“We are up against not an organized state monolith but dozens, hundreds and thousands of state and nonstate actors — nuts with nukes, freelance bioterrorists, Islamofascists, independent but allied terror groups. The temperature of our world is very high.”
David R. Obey
U.S. House of Representatives (Democrat – Wisconsin).
“Obviously if there’s an attack in ports, you could have hundreds of thousands of people die, depending on the weapons used, and there certainly is a colossal risk to the economy.”
physician and director of the Center for Biosecurity at the University of Pittsburgh Medical Center.
“It is true that pandemic flu is important, and we’re not doing nearly enough, but I don’t think pandemic flu could take down the United States of America. A campaign of moderate biological attacks could.”
Ian Pearson
in-house futurologist for Futurizon and an advisor on our Scientific Advisory Board.
“In 1900 there were only a few ways for the planet to be wiped out: comet, disease etc. But in the last few decades we have amassed a whole plethora of possibilities: nuclear, environmental, biological, and a lot of future threats will come from computing.” “We’ve managed to get ourselves into a position where the statistical chances of extinction will soon exceed one percent [per year]. It means that sometime in the next 100 years the human race will be wiped out somehow.” “Given this and the rate of technological advancement, I think the human race could be extinct within the next 30 to 40 years.”
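Taken at face value, a constant one percent annual risk makes extinction within a century likely but not certain; the compounding works out to roughly two chances in three (assuming, as the quote implicitly does, that the rate stays fixed and the years are independent):

\[
% Compounding a 1% annual extinction risk over 100 years:
P(\text{extinction within 100 years}) = 1 - (1 - 0.01)^{100} \approx 1 - 0.366 = 0.634
\]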
Chris Phoenix
cofounder of the Center for Responsible Nanotechnology.
“I don’t see much hope if we enter the future unprepared.”
James Pinkerton
fellow at the New America Foundation, a columnist for Newsday and TechCentralStation.com, and a contributor to the Fox News Channel. He authored What Comes Next: The End of Big Government and the New Paradigm Ahead and is a member of our Scientific Advisory Board.
“But the continuing advance of technology has brought a new dilemma: Increasingly, any single individual or small group can wield great destructive power. If one were to draw a line over the course of history, from the first tomahawk, through the invention of gunpowder, all the way to the A-bomb, one would see a steeply upsloping curve.” “Thanks to computers, that upslope is likely to stay steep for a long time to come, as artificial brain power doubles and redoubles. Techno-progress will be spread out across the full spectrum of human activity, but if history is any guide, then much ‘progress’ will come in the form of more lethal weapons, including nano-weapons. Thus, the ‘suitcase nuke’ that we fear today could be superseded by future mass-killers that fit inside a thimble — or a single strand of DNA.”
Martin Rees
Royal Society Professor at Cambridge University, a Fellow of Kings College, and the U.K.’s Astronomer Royal. The winner of the 2001 Cosmology Prize of the Peter Gruber Foundation and our 2004 Guardian Award, he has published numerous academic papers and books including Our Final Hour: A Scientist’s Warning: How Terror, Error, and Environmental Disaster Threaten Humankind’s Future In This Century — On Earth and Beyond. He is a member of our Scientific Advisory Board.
“Science is advancing faster than ever, and on a broader front… But there is a dark side: new science can have unintended consequences; it empowers individuals to perpetrate acts of megaterror; even innocent errors could be catastrophic. The ‘downside’ from twenty-first century technology could be graver and more intractable than the threat of nuclear devastation that we have faced for decades.” “If there were millions of independent fingers on the button of a Doomsday machine, then one person’s act of irrationality, or even one person’s error, could do us all in.”
“Biotechnology is advancing rapidly, and by 2020 there will be thousands — even millions — of people with the capability to cause a catastrophic biological disaster. My concern is not only organized terrorist groups, but individual weirdos with the mindset of the people who now design computer viruses. Even if all nations impose effective regulations on potentially dangerous technologies, the chance of an active enforcement seems to me as small as in the case of the drug laws.”
“We can ask of any innovation whether its potential is so scary that we should be inhibited in pressing on with it, or at least impose some constraints. Nanotechnology, for instance, is likely to transform medicine, computers, surveillance, and other practical areas, but it might advance to a stage at which a replicator, with its associated dangers, became technically feasible. There would then be the risk, as there now is with biotechnology, of a catastrophic ‘release’ (or that the technique could be used as a ‘suicide weapon’).”
“To put effective brakes on a field of research would require international consensus. If one country alone imposed regulations, the most dynamic researchers and enterprising companies would simply move to another country, something that is happening already in stem cell research. And even if all governments agreed to halt research in a particular field, the chances of effective enforcement are slim.”
“Even if all the world’s scientific academies agreed that some specific lines of inquiry had a disquieting ‘downside’ and all countries, in unison, imposed a formal prohibition, then how effectively could it be enforced? An international moratorium could certainly slow down particular lines of research, even if they couldn’t be stopped completely. When experiments are disallowed for ethical reasons, enforcement with ninety-nine percent effectiveness, or even just ninety percent, is far better than having no prohibition at all; but when experiments are exceedingly risky, enforcement would need to be close to one hundred percent effective to be reassuring: even one release of a lethal virus could be catastrophic, as could a nanotechnology disaster. Despite all the efforts of law enforcers, millions of people use illicit drugs; thousands peddle them. In view of the failure to control drug smuggling or homicides, it is unrealistic to expect that when the genie is out of the bottle, we can ever be fully secure against bioerror and bioterror: risk would still remain that could not be eliminated except by measures that are themselves unpalatable, such as intrusive universal surveillance.”
“It is not inconceivable that physics could be dangerous too. Some experiments are designed to generate conditions more extreme than ever occur naturally. Nobody then knows exactly what will happen. Indeed, there would be no point in doing any experiments if their outcomes could be fully predicted in advance. Some theorists have conjectured that certain types of experiment could conceivably unleash a runaway process that destroyed not just us but Earth itself.”
“More ominously, there could be a crucial hurdle at our own present evolutionary stage, the stage when intelligent life starts to develop technology. If so, the future development of life depends on whether humans survive this phase.”
“Suppose that we had a fateful decision that would determine whether the species might soon be extinguished, or else whether it would survive almost indefinitely. For instance, this might be the choice of whether to foster the first community away from Earth, which, once established, would spawn so many others that one would be guaranteed to survive.”
“Even a few pioneering groups, living independently of Earth, would offer a safeguard against the worst possible disaster — the foreclosure of intelligent life’s future through the extinction of all humankind. The ever-present slight risk of a global catastrophe with a ‘natural’ cause will be greatly augmented by the risks stemming from twenty-first-century technology. Humankind will remain vulnerable so long as it stays confined here on Earth. Is it worth insuring against not just natural disasters but the probably much larger (and certainly growing) risk of human-induced catastrophes? Once self-sustaining communities exist away from Earth — on the Moon, on Mars, or freely floating in space — our species would be invulnerable to even the worst global disasters.”
“Once the threshold is crossed when there is a self-sustaining level of life in space, then life’s long-range future will be secure irrespective of any of the risks on Earth. Will this happen before our technical civilization disintegrates, leaving this as a might-have-been? Will the self-sustaining space communities be established before a catastrophe sets back the prospect of any such enterprise, perhaps foreclosing it forever? We live at what could be a defining moment for the cosmos, not just for our Earth.”
“What happens here on Earth, in this century, could conceivably make the difference between a near eternity filled with ever more complex and subtle forms of life and one filled with nothing but base matter.”
John Reid
Home Secretary for the United Kingdom.
“We are probably in the most sustained period of severe threat since the end of World War II. While I am confident that the security services and police will deliver 100% effort and 100% dedication, they cannot guarantee 100% success. Our security forces and the apparatus of the state provide a very necessary condition for defeating terrorism but can never be sufficient to do so on their own. Our common security will only be assured by a common effort from all sections of society.”
Founding Executive Partner of Sophos Partners, LLC.
“There’s just no way to guarantee human survival unless we move off this planet.”
Glenn Reynolds
contributing editor of Tech Central Station, where his special feature on technology and public policy called “Reynolds’ Wrap” appears each week. He is also the creator of the popular blog Instapundit and author of An Army of Davids: How Markets and Technology Empower Ordinary People to Beat Big Media, Big Government, and Other Goliaths.
“Stephen Hawking says that humanity won’t survive the next thousand years unless we colonize space. I think that Hawking is an optimist.”
“Most people — and politicians are worse, if anything — have short time horizons. Disasters are things that just don’t happen, until they do. Planning for them is ignored, or even looked down on, often by the very same people who are making after-the-fact criticisms that there wasn’t enough planning.”
“Over the long term, by which I mean the next century, not the next millennium, disaster may hold the edge over prevention: a nasty biological agent only has to get out once to devastate humanity, no matter how many times other such agents were contained previously. In the short term, prevention and defense strategies make sense. But such strategies take you only so far. As Robert Heinlein once said, Earth is too fragile a basket to hold all of our eggs. We need to diversify, to create more baskets. Colonies on the moon, on Mars, in orbit, perhaps on asteroids and beyond…”
Condoleezza Rice
U.S. Secretary of State.
“The phenomenon of weak and failing states is not new, but the danger they now pose is unparalleled. When people, goods and information traverse the globe as fast as they do today, transnational threats such as disease or terrorism can inflict damage comparable to the standing armies of nation-states… Weak and failing states serve as global pathways that facilitate the spread of pandemics, the movement of criminals and terrorists, and the proliferation of the world’s most dangerous weapons.”
Tom Ridge
first U.S. Homeland Security Director.
“The general theme of it’s-not-a-matter-of-if-but-when is legitimate.”
Donald H. Rumsfeld
U.S. Secretary of Defense.
“It is inevitable that terrorists will obtain weapons of mass destruction, and that they will use them against us.”
Carl Sagan
American astronomer, planetologist, astrobiologist, and popularizer of science and space research.
“All civilizations become either spacefaring or extinct.”
Marshall T. Savage
author of The Millennial Project: Colonizing The Galaxy In Eight Easy Steps.
“Perhaps advanced civilizations don’t use radio, or radar, or microwaves. Advanced technology can be invoked as an explanation for the absence of extraterrestrial radio signals. But it seems unlikely that their technology would leave no imprint anywhere in the electromagnetic spectrum. We have been compared to the aborigine who remains blissfully unaware of the storm of radio and TV saturating the airwaves around him. Presumably, the aliens use advanced means of communications which we cannot detect. What these means might be is, by definition, unknown, but they must be extremely exotic. We don’t detect K2 signals in the form of laser pulses, gamma rays, cosmic rays, or even neutrinos. Therefore the aliens must use some system we haven’t even imagined. This argument, appealing though it is, cannot survive contact with Occam’s razor — in this case Occam’s machete. The evidence in hand is simply nothing — no signals. To explain the absence of signals in the presence of aliens demands recourse to what is essentially magic. Unfortunately, the iron laws of logic demand that we reject such wishful thinking in favor of the simplest explanation which fits the data: No signals; no aliens. The skies are thunderous in their silence; the Moon eloquent in its blankness; the aliens are conclusive in their absence. The extraterrestrials aren’t here. They’ve never been here. They’re never coming here. They aren’t coming because they don’t exist. We are alone.”
“Now is the watershed of Cosmic history. We stand at the threshold of the New Millennium. Behind us yawn the chasms of the primordial past, when this universe was a dead and silent place; before us rise the broad sunlit uplands of a living cosmos. In the next few galactic seconds, the fate of the universe will be decided. Life — the ultimate experiment — will either explode into space and engulf the star-clouds in a fire storm of children, trees, and butterfly wings; or Life will fail, fizzle, and gutter out, leaving the universe shrouded forever in impenetrable blankness, devoid of hope. Teetering here on the fulcrum of destiny stands our own bemused species. The future of the universe hinges on what we do next. If we take up the sacred fire, and stride forth into space as the torchbearers of Life, this universe will be aborning. If we carry the green fire-brand from star to star, and ignite around each a conflagration of vitality, we can trigger a Universal metamorphosis. Because of us, the barren dusts of a million billion worlds will coil up into the pulsing magic forms of animate matter. Because of us, landscapes of radiation-blasted waste will be miraculously transmuted: Slag will become soil, grass will sprout, flowers will bloom, and forests will spring up in once sterile places. Ice, hard as iron, will melt and trickle into pools where starfish, anemones, and seashells dwell — a whole frozen universe will thaw and transmogrify, from howling desolation to blossoming paradise. Dust into Life; the very alchemy of God. If we deny our awesome challenge; turn our backs on the living universe, and forsake our cosmic destiny, we will commit a crime of unutterable magnitude. Mankind alone has the power to carry out this fundamental change in the universe. Our failure would lead to consequences unthinkable. This is perhaps the first and only chance the universe will ever have to awaken from its long night and live. We are the caretakers of this delicate spark of Life. To let it flicker and die through ignorance, neglect, or lack of imagination is a horror too great to contemplate.”
Robert J. Sawyer
called “the dean of Canadian science fiction”, he has consulted for the Canadian Federal Government’s Department of Justice on what Canadian law should be in relation to biotechnology, stem-cell research, cloning, and the privacy of personal genetic information. He is a member of our Scientific Advisory Board.
“There’s a long-standing problem in astronomy called the Fermi Paradox, named for physicist Enrico Fermi who first proposed it in 1950. If the universe should be teeming with life, asked Fermi, then where are all the aliens? The question is even more vexing today: SETI, the search for extraterrestrial intelligence with radio telescopes, has utterly failed to turn up any sign of alien life forms. Why? One chillingly likely possibility is that, as the ability to wreak damage on a grand scale becomes more readily available to individuals, soon enough just one malcontent, or one lunatic, will be able to destroy an entire world. Perhaps countless alien civilizations have already been wiped out by single terrorists who’d been left alone to work unmonitored in their private laboratories.”
NATO Secretary General Jaap de Hoop Scheffer
“We have terrorism everywhere. There’s fights everywhere, be it here in this city (Istanbul), be it in New York, Uzbekistan, Mombasa, Yemen, you name it.”
Brad Sherman
U.S. House of Representatives (Democrat – California).
“This technology [nanotechnology] is every bit as explosive as nuclear weapons.”
Ray Solomonoff
founder of the branch of Artificial Intelligence based on machine learning, prediction, and probability. He was on our Scientific Advisory Board until his death.
“The Lifeboat problem becomes more and more critical as our technology ‘progresses’.”
offers comprehensive bite-size summaries of military news and affairs on the Internet. They provide inside data on how and why things happen.
“The Department of Homeland Security (DHS) initially sought to identify all the vulnerabilities to terrorism in the United States. Month by month, the list grew longer. It quickly became apparent that there would never be sufficient resources to defend against all these potential threats.”
Jill Tarter
an American astronomer best known for her work on the search for extraterrestrial intelligence (SETI). Jill is the former director of the Center for SETI Research, holding the Bernard M. Oliver Chair for SETI at the SETI Institute. In 2002, Discover magazine recognized her as one of the 50 most important women in science.
“Your organization is inspiring and essential for all life on this planet.”
American media visionary, philanthropist, and statesman.
“When people are moving too slowly to respond to a danger, one option is to make it more vivid. Seeing the danger is the first step to reducing the risk.”
“Hurricane Katrina drove home the staggering devastation that disasters — natural or man-made — can inflict. Meanwhile, July’s attacks on the London Underground reminded us terrorists can still strike major world cities. Now imagine the two joined together: terrorists, armed with weapons of mass destruction, unleashing Katrina-scale chaos and death in the heart of a U.S. city.” “The risk of a Katrina-scale terrorist attack with Russian weapons is too critical to tolerate any delays to these crucial efforts. Congress must act and free us to meet what President Bush calls ‘the greatest threat before humanity today’.”
Neil deGrasse Tyson
Chairman of the Board of The Planetary Society.
“If humans one day become extinct from a catastrophic collision, there would be no greater tragedy in the history of life in the universe. Not because we lacked the brain power to protect ourselves but because we lacked the foresight. The dominant species that replaces us in post-apocalyptic Earth just might wonder, as they gaze upon our mounted skeletons in their natural history museums, why large-headed Homo sapiens fared no better than the proverbially pea-brained dinosaurs.”
National Academy of Sciences
Upon the authority of the charter granted to it by the Congress in 1863, the Academy has a mandate that requires it to advise the federal government on scientific and technical matters.
“Just a few individuals with specialized skills and access to a laboratory could inexpensively and easily produce a panoply of lethal biological weapons that might seriously threaten the US population. Moreover, they could manufacture such biological agents with commercially available equipment — that is, equipment that could also be used to make chemicals, pharmaceuticals, foods, or beer — and therefore remain inconspicuous.”
Vernor Vinge
mathematician, computer scientist, and prophetic SF writer who predicted the Internet in 1981 and the Singularity in 1993.
“If the Singularity can not be prevented or confined, just how bad could the Post-Human era be? Well … pretty bad. The physical extinction of the human race is one possibility.”
“Epitaph: Foolish humans, never escaped Earth.”
authored the site Rationallink.org and was on our Scientific Advisory Board until his death.
“As an armchair exercise you can reflect on the hundreds of millions of years for intelligence to arise on Earth. And then, with agriculture and the rise of leisure and then science, comes the perennial struggle for power — for the dominion of one person over others — which seems an unavoidable consequence of intelligence. There result weapons of war and destruction and their proliferation and the possibility that one heedless maniac can destroy the entirety of civilization. The window of time, for the rise of science that can either destroy this world or undertake communication with other worlds, is minute — vanishingly small — compared to the time required for the rise of intelligence from the dust of creation. That window of time will likely forever bar communication and cooperation between worlds.”
White House
US National Security Council.
“We are menaced less by fleets and armies than by catastrophic technologies in the hands of the embittered few.”
White House official
speaking to the Washington Post.
“They are going to kill the White House. I have really begun to ask myself whether I want to continue to get up every day and come to work on this block.”
Bob Woodward
has authored or coauthored eight No. 1 national nonfiction bestsellers, including four books on the presidency.
“The realities at the beginning of the 21st century were two: the possibility of another massive, surprise terrorist attack similar to September 11, and the proliferation of weapons of mass destruction — biological, chemical or nuclear. Should the two converge in the hands of terrorists or a rogue state, the United States could be attacked and tens of thousands, even hundreds of thousands of people could be killed. In addition, the president and his team had found that protecting and sealing the U.S. homeland was basically impossible. Even with heightened security and the national terrorist alerts, the country was only marginally safer.”
Jonathan Zittrain
cofounded the Berkman Center for Internet and Society at Harvard Law School and holds the Chair in Internet Governance and Regulation at the University of Oxford.
“[The system functions as well as it does only because of] the forbearance of the virus authors themselves. With one or two additional lines of code … the viruses could wipe their hosts’ hard drives clean or quietly insinuate false data into spreadsheets or documents. Take any of the top ten viruses and add a bit of poison to them, and most of the world wakes up on a Tuesday morning unable to surf the Net — or finding much less there if it can.”