
Small steps that can make a difference on global catastrophes

Posted in existential risks

Danila Medvedev asked me to make a list of concrete projects that could reduce the likelihood of global catastrophe.

EDITED: This list reflects only my personal opinion and not the opinion of LF. The suggested ideas are not final; further discussion of them is needed. The ideas are also mutually independent.

1. Create the book “Guide to the Restoration of Civilization”, which would describe all the necessary knowledge of hunting, industry, and mining, together with warnings about the risks, for the case of civilization collapse. Test its different sections on volunteers. Print the book on stone / metal / other durable media in many copies throughout the world. Bury caches with tools / books / seeds in different parts of the world. Cost: 1–100 million USD. Reduction of the probability of extinction (assuming that the real prior probability is 50% in the XXI century): 0.1%.
2. Collect money for the work of the Singularity Institute on creating a Friendly AI. They need 3 million dollars. This project has the highest impact-to-cost ratio: it could really increase the chances of survival of humanity by about 1 percent. (This figure is the product of several probability estimates: that AI is possible, that SIAI will solve the problem, that it will solve it first, that it will also solve the problem of friendliness, and that the money they have will be enough; a worked sketch of this multiplication appears after the list.)
3. Cryopreserve a few people in the ice of Antarctica (the temperature there is −57 C; in addition, a stable region of lower temperature could be maintained by pumping in liquid nitrogen), so that if another advanced civilization arises on Earth, it could revive them. Cost: several million dollars. A related project for the preservation of human knowledge is in the spirit of the titanium discs with recorded information proposed by the Long Now Foundation.
4. Send human DNA to the Moon in a stable time capsule. Several tens of millions of dollars. A cryopreserved human brain could also be sent. The idea is that if mankind perishes, aliens may someday arrive and revive people based on these data. Cost: 20–50 million dollars; probability of success: 0.001%. Human DNA could also be sent into space in other ways.
5. Accelerated development of universal vaccines. Creation of world reserves of powerful means of decontamination for the event of a global epidemic, and stockpiling of antiviral drugs and vaccines against the majority of known viruses, enough for a large part of humanity. Establishment of virus monitoring and instant diagnosis (test strips). Creation and production of many billions of advanced disinfection tools such as personal UV lamps, nanotech face masks, gloves, etc. Cost: billions or hundreds of billions of dollars a year. Creation of personal stockpiles of food and water in each house, sufficient for a month. Development of supply systems requiring no contact between people. A switch to slow global transport (ships) in the event of a pandemic. Training of medical personnel and the creation of spare beds in hospitals. Creation and testing, on real problems, of huge factories which in a few weeks could develop and produce billions of doses of vaccines. Improvement of legislation in the field of quarantine. There are also risks. Increases the probability of survival by 2–3 percent.
6. Creating a self-contained bunker with a supply of food for several decades and with constant “crews” able to restore humanity. About $1 billion. Also preserve those types of resources that humanity could use for recovery at the post-apocalyptic stage.
7. The creation of a scientific court for the Hadron Collider and other potentially dangerous projects, in which theoretical physicists would be paid large sums of money for discovering potential vulnerabilities.
8. Adaptation of the ISS to function as a bunker in case of disasters on Earth — the creation of a series of additional ISS modules which could support the existence of the crew for 10 years. Cost: tens of billions of dollars.
9. Creation of an autonomous, self-sustaining base on the Moon. At the present level of technology — about $1 trillion or more. Proper development of a space exploration strategy would make it cheaper — that is, investment in new types of engines and cheap means of delivery. Increases survival by 1 percent. (But there are also new risks.)
10. The same on Mars. Several trillion. Increases survival by 1–2 percent.
11. Creating a nuclear-powered interstellar Ark ship — tens of trillions of dollars. Increases survival by 1–2 percent.
12. (For the following items, money is not enough; political will is also needed.) Destruction of rogue states and the establishment of a world state. A 10 percent increase in survival. However, there are high risks in the process.
13. Creating a global center for rapid response to global risks. Something like Special Forces or a Ministry of Emergency Situations that can be thrown at global risks. It should be enabled to take instant action, including military action, as well as intelligence gathering, and given a veto on dangerous experiments. Strengthening of civil defense in this field.
14. A ban on private science (in the sense of science in the garage) and the creation of several centers of certified science (science towns with centralized security control) with a high level of funding for breakthrough research in biotechnology, nuclear technology, artificial intelligence, and nanotechnology. This would help prevent the dissemination of knowledge of mass destruction, but would not stop progress. It is possible only after the abolition of nation states. A few percent increase in survival. These science towns could freely exchange technical information among themselves, but would not have the right to release it into the outside world.
15. Legislation requiring the duplication of vital resources and activities — which would make a domino-effect collapse of civilization from a failure at one point impossible. A ban on super-complex systems of social organization, whose behavior is unpredictable and too prone to domino effects, and their replacement with linear, repetitive production systems — that is, opposition to economic globalization.
16. Certification and licensing of researchers in bio, nano, AI, and nuclear technologies. A legislative requirement to check all their own and others’ inventions for the global risks associated with them, and a commitment to develop means of protection in case their inventions get out of control.
17. A law on raising the intelligence of the population: half of the population would be conceived by fertilization from a few hundred fathers selected as the best in terms of intelligence, common sense, and aversion to risk. (The second half would breed in the usual manner to maintain genetic diversity; the project would be implemented without violence, through cash payments.) Plus education reform, in which school is replaced by a training system that gives an important role to good sense and knowledge of logic.
18. Limitation of capitalist competition as the engine of the economy, because it leads to an underestimation of risk in the long term.
19. Directing investment in breakthrough fields like nanotechnology to the best and most critical facilities, in order to slip through the dangerous period quickly.
20. Growth of systems of total information control and surveillance, plus certification of the data in them and pattern recognition. Control of the Internet and personal authorization for network logins. Continuous monitoring of all persons who possess potentially dangerous knowledge.
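The one-percent figure in item 2 is simply a chain of probability estimates multiplied together. Below is a minimal sketch of that arithmetic in Python; every individual probability in it is an illustrative assumption of mine (the post does not give the component numbers), chosen only so that the product comes out near 1 percent.

```python
# Sketch of the product-of-probabilities estimate behind item 2.
# All individual probabilities below are illustrative assumptions,
# not figures from the post; only the structure of the calculation matters.

p_ai_possible      = 0.5   # assumed: AI of the relevant kind is possible
p_siai_solves_ai   = 0.2   # assumed: SIAI solves the AI problem at all
p_solves_it_first  = 0.5   # assumed: it solves it before anyone else
p_friendliness_ok  = 0.5   # assumed: the friendliness problem is also solved
p_money_is_enough  = 0.4   # assumed: the 3 million dollars is actually enough

p_impact = (p_ai_possible * p_siai_solves_ai * p_solves_it_first
            * p_friendliness_ok * p_money_is_enough)

print(f"Estimated increase in survival chances: {p_impact:.1%}")
# With these assumed inputs the product is 1.0%, the same order of
# magnitude as the estimate given in item 2.
```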
Another step could be creating a global think tank of the best experts on global risks and setting them the objective of developing a positive scenario. It is necessary to understand which way of combining these specialists would be most effective, so that A) they do not eat each other because of differing ideas and feelings of their own importance; B) it does not become a money trough; C) they nevertheless receive money for the work, which would allow them to concentrate fully on this issue. That is, it should be something like an edited journal, a wiki, a scientific court, or a prediction market. But the form of association should not be too exotic, and exotic forms should first be tested on less important matters.
However, the creation of a model of global risk that is accurate and credible to everyone would reduce the probability of global catastrophe by at least half. And we are still at the stage of creating such a model. Therefore, how to create such models and how to validate them are now the most important questions, though some time may already have been lost.
I emphasize that the main problems of global risk lie within the sphere of knowledge rather than the sphere of action. That is, the main problem is that we do not know what we should prepare for, not that we lack instruments of defense. Risks are removed by knowledge and expertise.
Implementation of these measures is technically and economically possible and could, in my estimation, reduce the chance of extinction in the XXI century by a factor of 10.
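For readers who want to see how such a tenfold figure might be checked, here is a rough aggregation sketch. It simply subtracts the per-item survival gains quantified above from the 50% prior stated in item 1; the independence assumption and the midpoint values are my simplifications for illustration, not the author's explicit method.

```python
# Rough aggregation of the per-measure survival gains quantified above.
# Simplification for illustration only: treat each gain as an independent
# percentage-point reduction of the 50% prior extinction probability
# stated in item 1.

prior_extinction = 0.50

# Gains taken from the items above (midpoints where a range is given;
# item 14's "a few percent" is read here, as an assumption, as 3%).
gains = {
    "restoration guide (item 1)":        0.001,
    "Friendly AI funding (item 2)":      0.010,
    "pandemic preparedness (item 5)":    0.025,
    "Moon base (item 9)":                0.010,
    "Mars base (item 10)":               0.015,
    "interstellar ark (item 11)":        0.015,
    "world state (item 12)":             0.100,
    "certified science towns (item 14)": 0.030,
}

residual = prior_extinction - sum(gains.values())
print(f"Residual extinction probability: {residual:.1%}")
print(f"Reduction factor: {prior_extinction / residual:.1f}x")
# Under these assumptions the quantified items alone give roughly a 1.7x
# reduction; the tenfold figure evidently also counts the unquantified
# measures and the knowledge-building work discussed above.
```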

Any ideas or missed projects?

14 Comments so far

  1. If you are serious about #14 then you are my sworn enemy. The general statist tone is a disaster. Governments have killed more of their own people in non-wars (not to mention wars) than anything else. Presuming that a single government would be less of a danger is utterly unjustified. Presuming that any official group is bright and smart and benevolent enough to determine forever more what science should and should not be done is utterly unjustified. I would consider much of the above a clear danger to species survival.

  2. Samanta, I wrote about the ban on private science with pain in my heart. I don’t like governments, I don’t like bans, but I think this could increase the probability of survival during the short, dangerous period of the next several decades.

  3. We really must escape this mass delusion that greater political control equals greater safety; a casual understanding of political history and economics reveals our trust in totalitarianism (which you ARE supporting) to be sorely misplaced.

    It should be obvious, but you must understand that by illegalizing something you don’t stop it from happening; you just deliver it into the hands of criminals.

    How many stories have you heard about the Budweiser company killing people on the street? None, right? Yet Budweiser provides an equivalent service to Al Capone.

    How much of the crime that now exists in the U.S. is related to drug laws?

    Some economists estimate that the black market accounts for 15–20 percent of the world economy.

    Do you really think you can control and regulate people to such a degree that nobody will ever be able to escape your grasp?

    Do you really believe that a worldwide totalitarian state will allow humanity a greater chance at survival, when all of history contradicts you?

    Do you really feel that science and economic development are better in the hands of dictators and rulers?

    Science is crushed and warped under the state; crime and lawlessness are rampant in controlled societies (when you make everything illegal, it is very hard to find the real troublemakers; just look at illegal immigration).

    At least “capitalist competition” provides some natural protections from corruption. Under your “safe” society, corruption would become a way of life, and the only means of survival (as it has in countless places throughout history.)

    Quite simply, your quest for control and security will fail miserably if brought to fruition and, ironically, will likely bring about the opposite of what you seek.

    I’m sorry, but this article is historically ignorant, economically invalid, morally repugnant, and on top of that, just plain silly.

    I’m afraid I can no longer support the Lifeboat Foundation, in any way, shape, or form.

  4. This article is my own position and is not the position of LF. The theses in it are suggestions for discussion, not final conclusions.

    But I think you don’t understand the main idea: science should not be banned. It should be open to everyone, but in safe places.

    Look, for example, at driving — it is open to everyone, but only after passing exams.

    Or look at the laws on toy missile construction in the US. Everyone can start learning how to build a missile, but to actually launch it you need permission, and you can do that only after several years of learning. And in order to launch a bigger one you must have experience with small ones.

    And nobody has the right to start a uranium chain reaction at home, but everybody can go to a university to learn and to work in a safe laboratory.

    How can we prevent a situation where one bioterrorist could kill half of the human population, without a system of global control?

  5. Substantially more private consideration before public posting here at Lifeboat Foundation might well have greatly improved the piece. The author might have ranked the various proposals on three axes: financial cost, expected helpfulness, and opportunity cost. This would have given readers a way to read each proposal more or less independently of the set. As written, the reader can’t tell which points the author thinks most important by any criterion.

    If a system of restricting and limiting information for safety’s sake makes global sense, then I suggest an editorial system of reviewing and revising information for your readers’ sakes makes local sense. Inasmuch as it represents a private individual’s position, why does L.F. post it here at all?

    I expect much better from this organization.

  6. Thank you for getting this important discussion (re)started.

    Commenters, we all need to respect the author’s preface that his ideas are intended for public evaluation (think white board, folks, or initial brainstorming) rather than as policies he advocates for implementation. Your disagreement with one or more specific items, even if valid, doesn’t negate the value of this timely discussion. Those of us who have children should think of this as their future at stake.

    Second, one addition as to risks to address: applying Nick Bostrom’s definition of existential risk, one which “threatens to cause the extinction of earth-originating intelligent life or to reduce its quality of life… permanently and drastically”, to the broader global catastrophic risks category, I would add sustained political repression as a very real danger. This scenario could have a 20% or higher probability in the decades ahead. Most important, this may also be a risk which timely, thoughtful discussion and planning can substantially reduce, either by prevention or mitigation (e.g. David Brin’s Transparent Society as one simplified example).

    For further discussion see my postings at http://www.sustainablerights.blogspot.com.

  7. Thank you for understanding, Richard!

    We could have a democratic society with strict laws about dangerous activities. Indeed, dictatorships are often corrupt states where laws can be altered by bribery.

    So I argued for a democratically elected and controlled global government, not an authoritarian power.

    Also, I argued not for banning science but for putting it in safe places, which would help good projects grow — and this could lead to even quicker development of science than today.

    But the amount of negative reaction I have encountered shows me that, most likely, such plans will not be realized.

  8. I think that there are valuable points on both sides of the debate. I have never considered the positions posted on this blog to be anything but those of the author. It is important to have free-style posts where people can throw out their ideas and solicit positive and negative critiques. This is necessary if good ideas are to have a hearing, influence the thinking of others, and be improved upon in the process. Please don’t change the nature of this blog!

    On the other hand, Richard brings up a good point by essentially saying that there needs to be some sort of editorial structure so that there can be positions which are representative of the Lifeboat Foundation. I view many of the pages of the LF website as being (somewhat) the official positions of the LF. Unfortunately, those pages do not allow for feedback from lay commenters like myself. Perhaps there could be another part of the website established where official “papers” could be published and feedback allowed.

    This leads me to one other idea. The LF has assembled a very impressive collection of experts into its various boards. My presumption has been that this blog only has contributions from its official bloggers or perhaps also from any of its many board members. But these people are generally professionals who have credentials justifying their having been invited to be on a board. Also, my feeling is that the frequency of posting on this blog is too low; e.g. there have only been five articles in the last 53 days.

    I myself would love to submit articles for consideration and there might be other people who would like to as well. Can we do that? That would be pretty cool!

    P.S. Incidentally, so far Alexei, you are my favorite blogger here. Your posts have addressed practical ways (e.g. bunkers) of addressing existential risks. It seems as though you are really searching for solutions and not just addressing the issue in an “academic” mindset.

  9. “I myself would love to submit articles for consideration and there might be other people who would like to as well. Can we do that? That would be pretty cool!”

    Sure, send us an email with the subject “Lifeboat Foundation John Hunt” at [email protected] and we’ll give you author access to our blog, etc.

  10. Hi Eric,

    I sent the email with that subject several days back but have not received any word back. Thanks. John

  11. Hi John,

    “I sent the email with that subject several days back but have not received any word back.”

    We have now resent our response. Check your spam folder…

  12. Alexei writes: So I argued for a democratically elected and controlled global government, not an authoritarian power.

    Alas, “democracy is the worst form of government, except for all of the other forms that have been tried” (Winston Churchill). I suspect a better system is out there waiting to be found. Undoubtedly it will be democratic in important ways, but it might not be democratic in the same way leading democracies are now.
