
Graduate student Blake Anderton (University of Alabama in Huntsville) wrote his master’s thesis on “Application of Mode-locked Lasers to Asteroid Characterization and Mitigation.” Undergraduate Gordon Aiken won a prize at a recent student conference for his poster and presentation “Space positioned LIDAR system for characterization and mitigation of Near Earth Objects.” And members of the group are building a laser system “that is the grandfather of the laser that will push the asteroids,” Fork said.

Anderton’s mode-locked lasers could characterize asteroids up to 1 AU away (1.5 × 10^11 meters). Arecibo and other radar observatories can only detect objects up to 0.1 AU away, so in theory a laser would represent a vast improvement over radar.

A one-page PowerPoint describes their asteroid detection and deflection approach. About 12 of the 1 AU detection volumes (around the Sun in the asteroid belt) would be needed to cover the main areas for near-Earth asteroids.
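A quick back-of-envelope sketch (mine, not from the presentation) shows why these numbers are plausible. Tripling-by-ten the detection range grows the searchable volume a thousandfold, and a ring of 1 AU spheres around the main belt — assumed here at roughly 2.5 AU from the Sun, a figure not stated in the post — comes out near the quoted dozen:

```python
import math

# Volume of a detection sphere scales with the cube of its radius,
# so extending range from 0.1 AU (radar) to 1 AU (laser) grows the
# searchable volume by a factor of 1000.
volume_ratio = (1.0 / 0.1) ** 3

# Rough count of 1 AU detection spheres ringing the main asteroid
# belt, assumed at ~2.5 AU from the Sun (an assumption on my part),
# with sphere centers spaced 2 AU apart along the circumference:
belt_radius_au = 2.5
circumference_au = 2 * math.pi * belt_radius_au
spheres_needed = math.ceil(circumference_au / 2.0)

print(volume_ratio)    # 1000.0
print(spheres_needed)  # 8 with no overlap; the post's ~12 allows coverage margin
```

With zero overlap eight spheres suffice geometrically; overlapping coverage and margin for the belt's thickness push the count toward the dozen the slide cites.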

40 kW femtosecond lasers could deflect an asteroid the size of Apophis (320 meters across; an impact would release about 880 megatons of energy) given one year of illumination and an early start in its trajectory.
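To see whether a 40 kW beam could matter, here is a rough laser-ablation estimate of my own. The density, momentum-coupling coefficient, and 10-year coast time are assumptions not given in the post; only the 320 m size and 40 kW power come from it:

```python
import math

diameter_m = 320.0       # Apophis diameter, from the post
density = 2600.0         # kg/m^3, typical stony asteroid (assumption)
mass = (4.0 / 3.0) * math.pi * (diameter_m / 2) ** 3 * density

laser_power_w = 40e3     # 40 kW, from the post
coupling_n_per_w = 1e-4  # ablation thrust per watt (assumed; published
                         # values span roughly 1e-5 to 1e-4 N/W)
thrust_n = laser_power_w * coupling_n_per_w

year_s = 3.156e7
dv = thrust_n * year_s / mass       # velocity change after one year of pushing

# Displacement accumulated over a further 10 years of coasting
# (the "early start" the post mentions):
drift_km = dv * 10 * year_s / 1000.0

print(f"mass ~ {mass:.2e} kg, dv ~ {dv:.2e} m/s, drift ~ {drift_km:.0f} km")
```

Under these assumptions a year of illumination yields a few mm/s of velocity change and several hundred kilometers of miss distance a decade later — small against an Earth radius, but enormous against the sub-kilometer “keyhole” corridors that govern whether a close approach sets up a later impact.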

Asteroid shields are a project of the Lifeboat Foundation.

There are 67-kilowatt solid-state lasers, as well as modular laser systems and mirrors for combining beams, which achieve more laser power from smaller modules.

A giant asteroid named Apophis has a one in 45,000 chance of hitting the Earth in 2036. If it did hit the Earth, it could destroy a city or a region. A slate of new proposals for addressing the asteroid menace was presented at a recent meeting of the American Association for the Advancement of Science in San Francisco.

One of the Lifeboat Foundation’s projects is an Asteroid Shield, and the issues and points discussed here are in direct alignment with it. The specific detection and deflection efforts fall under the Lifeboat Asteroid Shield project.

Edward Lu of NASA has proposed a “gravitational tractor”: a spacecraft of up to 20 tons (18 metric tons) that could divert an asteroid’s path just by thrusting its engines in a specific direction while hovering in the asteroid’s vicinity, letting the craft’s own gravity slowly tug the asteroid onto a new course.
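The tug involved is tiny but steady. Here is an illustrative calculation in the spirit of Lu’s proposal; the hover distance and the Apophis-class asteroid mass are my assumptions, not figures from the article:

```python
# Rough gravitational-tractor estimate. Only the 18-tonne spacecraft
# mass comes from the article; the rest are assumptions.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
sc_mass = 18_000.0   # 18 metric tons, from the article
ast_mass = 4.6e10    # kg, rough Apophis-class mass (assumption)
hover_m = 200.0      # hover distance from asteroid center (assumption)

# Mutual gravitational pull between craft and asteroid:
force_n = G * sc_mass * ast_mass / hover_m ** 2

year_s = 3.156e7
dv = force_n * year_s / ast_mass  # asteroid velocity change per year of towing

print(f"tow force ~ {force_n:.2f} N, dv ~ {dv:.1e} m/s per year")
```

A newton-scale force sounds negligible, yet sustained for a year it imparts roughly a millimeter per second of velocity change — enough, applied early, to shift an encounter by thousands of kilometers decades later.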

Scientists also described two massive new survey-telescope projects to detect would-be killer asteroids.

One, dubbed Pan-STARRS, is slated to begin operation later this year. The project will use an array of four 6-foot-wide (1.8-meter-wide) telescopes in Hawaii to scan the skies.

The other program, the Large Synoptic Survey Telescope in Chile, will use a giant 27.5-foot-wide (8.4-meter-wide) telescope to search for killer asteroids. This telescope is scheduled for completion sometime between 2010 and 2015.
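Since light-gathering power scales with the square of aperture diameter, a quick comparison (my own arithmetic, using only the diameters quoted above) shows how the two surveys stack up:

```python
# Collecting area goes as diameter squared.
lsst_d = 8.4      # meters, LSST aperture from the article
ps_unit_d = 1.8   # meters, one Pan-STARRS telescope from the article

single_ratio = (lsst_d / ps_unit_d) ** 2          # LSST vs one Pan-STARRS unit
array_ratio = lsst_d ** 2 / (4 * ps_unit_d ** 2)  # LSST vs all four units combined

print(f"{single_ratio:.1f}x one unit, {array_ratio:.1f}x the four-scope array")
```

LSST collects over twenty times the light of a single Pan-STARRS unit and about five times that of the full four-telescope array, which is why it is the deeper of the two surveys even though it comes online later.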


David Morrison, an astronomer at NASA’s Ames Research Center, said that “the rate of discoveries is going to ramp up. We’re going to see discoveries being made at 50 to 100 times the current rate.”

“You can expect asteroids like Apophis [to be found] every month.”

Schweickart, the former astronaut, thinks the United Nations needs to draft a treaty detailing standardized international measures that will be carried out in response to any asteroid threat.

His group, the Association of Space Explorers, has started building a team of scientists, risk specialists, and policymakers to draft such a treaty, which will be submitted to the UN for consideration in 2009.

Two new reports on global security conclude that the risk of nuclear terrorism is growing, Reuters reported today.

The EastWest Institute and Chatham House, the two think tanks behind the reports, note that more states are pursuing their own nuclear ambitions and that the materials and engineering effort for a bomb “have all become commodities, more or less available to those determined enough to acquire them”.

The vulnerability of nuclear power plants is also mentioned. This is highly relevant considering all the new power plants under planning or construction. Read about the planned terrorist attack on a nuclear power plant in Australia in “Australia nuclear plant plot trial opens in Paris” (Reuters).

But most surprisingly:

Ken Berry, author of the EastWest Institute report, said the rise of environmental militants would bring “an even bigger prospect that scientific personnel from the richest countries will aid eco-terrorist use of nuclear weapons or materials”.

This reminds me of Pentti Linkola, the Finnish eco-philosopher considered by many an eco-fascist. In a Wall Street Journal interview he expresses the view that World War III would be “a happy occasion for the planet.… If there were a button I could press, I would sacrifice myself without hesitating, if it meant millions of people would die.”

Source: Reuters.

Read the reports: “Preventing Nuclear Terrorism” from the EastWest Institute and “The CBRN System: Assessing the Threat of Terrorist Use of Chemical, Biological, Radiological and Nuclear Weapons in the UK” from Chatham House (The Royal Institute of International Affairs).

An existential risk is a global catastrophic risk that threatens to exterminate humanity or severely curtail its potential. Existential risks are unique because current institutions have little incentive to mitigate them, except as a side effect of pursuing other goals. There is little to no financial return in mitigating existential risk. Bostrom (2001) argues that because reductions in existential risks are global public goods, they may be undervalued by the market. Also, because we have never confronted a major existential risk before, we have little to learn from, and little impetus to be afraid. For more information, see this reference.

There are three main categories of existential risk — threats from biotechnology, nanotechnology, and AI/robotics. Nuclear proliferation itself is not quite an existential risk, but widespread availability of nuclear weapons could greatly exacerbate future risks, serving as a stepping stone toward arms races in post-nuclear technologies. We’ll look at that first, then go over the others.

Nuclear risk. The risk of nuclear proliferation is currently high. The United States is planning to spend $100 billion on developing new nuclear weapons, and reports suggest that the President is not doing enough to curtail nuclear proliferation, despite the emphasis on the War on Terror. Syria, Qatar, Egypt, and the United Arab Emirates met to announce their desire to develop nuclear technology. North Korea successfully tested a nuclear weapon in October. Iran continues enriching uranium against the will of the United Nations, and an Iranian official hinted that the country may be seeking nuclear weapons. Last night, President Bush used the most confrontational language yet toward Iran, accusing it of directly providing weapons and funds to combatants killing US soldiers. The geopolitical situation today with respect to nuclear technology is probably the worst it has been since the Cold War.

Biotechnological risk. The risk of biotechnological disaster is currently high. An attempt among synthetic life researchers to formulate a common set of ethical standards, at the International Conference on Synthetic Biology, has failed. Among the synthetic biology and biotechnology communities, there is little recognition of the risk of genetically engineered pathogens. President Bush’s plan to spend $7.1 billion on bird flu vaccines was decreased to $2.3 billion by Congress. There is little federal money being spent on research to develop blanket countermeasures against unanticipated biotechnological threats. There are still custom DNA synthesis labs that fill orders without first scanning for harmful sequences. Watch-lists for possible bioweapon sequences are out of date, and far from comprehensive. The lab equipment necessary to make bioweapons has decreased in cost and increased in performance, putting it within the financial reach of terrorist organizations. Until there is more oversight in this area, the risk will not only remain, but increase over time. For more information, see this report.

Nanotechnological risk. The risk of nanotechnological disaster is currently low. Although substantial progress has been made with custom machinery at the nanoscale, there is little effort or money going towards the development of molecular manufacturing, the most dangerous (but also most beneficial) branch of nanotechnology. Although the level of risk today is low, once it begins to escalate, it could do so very rapidly due to the self-replicating nature of molecular manufacturing. Nanotechnology researcher Chris Phoenix has published a paper on how it would be technologically feasible to go from a basic self-replicating assembler to a desktop nanofactory in a matter of weeks. His organization projects the development of nanofactories sometime before 2020. Once desktop nanofactories hit the market, it would be extremely difficult to limit their proliferation, as nanofactories could probably be used to create additional nanofactories very quickly. Unrestricted nanofactories, if made available, could be used to synthesize bombs, biological weapons, or synthetic life that is destructive to the biosphere. Important papers on nanoethics have been published by the Nanoethics Group, the Center for Responsible Nanotechnology, and the Lifeboat Foundation.
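The escalation dynamic above is just exponential doubling. A toy illustration (the one-day doubling time is a hypothetical figure of mine, not from Phoenix’s paper) shows how fast a self-replicating system outgrows its starting scale:

```python
# Hypothetical self-replication schedule: a gram-scale replicating
# system that doubles its output mass once per day (assumed figure)
# reaches tonne scale in under three weeks.
grams = 1.0
doubling_days = 1.0
days = 0
while grams < 1_000_000.0:   # one metric tonne, in grams
    grams *= 2
    days += doubling_days

print(days)  # 20 doublings: 2**20 grams is just over one tonne
```

This is why the risk curve is described as flat and then nearly vertical: almost all of the growth happens in the last few doublings, leaving little time to react once replication begins.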

Artificial Intelligence risk. The risk from AI and robotics is currently moderate. Because we know so little about how difficult AI is as a problem, we can’t say if it will be developed in 2010 or 2050. Like nanofactories, AI is a threat that could balloon exponentially if it gets out of hand, going from “negligible risk” to “severe risk” practically overnight. There is very little attention given towards the risk of AI and how it should be handled. Some of the only papers published on the topic during 2006 were released by the Singularity Institute for Artificial Intelligence. Just recently, Bill Gates, co-founder of Microsoft, wrote “A Robot in Every Home”, outlining why he thinks robotics will be the next big revolution. There has been increased acceptance, both in academia and the public, for the possibility of AI of human-surpassing intelligence. However, the concept of seed AI continues to be poorly understood and infrequently discussed both in popular and academic discourse.